Abstract:
As a part of vital infrastructure and transportation networks, bridge structures must function safely at all times. However, due to heavier and faster vehicular loads and changes in function, such as busway accommodation, many bridges now operate under loads beyond their design capacity. In addition, the huge cost of renovation and replacement is often difficult for infrastructure owners to bear. Structural health monitoring (SHM) is intended to assess the condition of designated bridges and to foresee probable failures. Recently proposed SHM systems incorporate vibration-based damage detection (VBDD) techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to address the problem. Recent developments in damage detection and condition assessment techniques based on VBDD and statistical methods are reviewed. The VBDD methods discussed here are based on changes, before and after damage, in natural frequencies, curvature/strain modes, modal strain energy (MSE), dynamic flexibility and artificial neural networks (ANN), as well as other signal processing methods such as wavelet techniques and empirical mode decomposition (EMD)/Hilbert spectrum methods.
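To make the simplest of these indicators concrete, the sketch below flags damage from a drop in the first natural frequency estimated from acceleration records. It is a minimal illustration assuming synthetic single-mode signals and an arbitrary decision threshold; none of the names or values come from the reviewed papers.

```python
# Illustrative sketch only: a minimal frequency-shift damage indicator of the
# kind used in VBDD. Signal names, threshold and the synthetic records are
# assumptions for demonstration, not taken from the reviewed papers.
import numpy as np
from scipy.signal import welch

def dominant_frequency(accel, fs):
    """Estimate the dominant natural frequency (Hz) from an acceleration record."""
    freqs, psd = welch(accel, fs=fs, nperseg=4096)
    return freqs[np.argmax(psd)]

def frequency_shift_index(accel_baseline, accel_current, fs):
    """Relative drop in the first natural frequency; positive values suggest damage."""
    f0 = dominant_frequency(accel_baseline, fs)
    f1 = dominant_frequency(accel_current, fs)
    return (f0 - f1) / f0

# Synthetic example: a 2.0 Hz mode softening to 1.9 Hz after "damage".
fs = 200.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
healthy = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
damaged = np.sin(2 * np.pi * 1.9 * t) + 0.1 * rng.standard_normal(t.size)

index = frequency_shift_index(healthy, damaged, fs)
print(f"frequency shift index: {index:.3f}")  # ~0.05; flag if above a chosen threshold
```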
Abstract:
Based on the Newmark-β method, the structural vibration response is predicted. By searching, within given ranges, for the control force parameters that optimize an objective function, predictive control of the structural vibration is achieved. A numerical simulation of a two-storey frame structure fitted with magneto-rheological (MR) dampers under earthquake records is then carried out, and the influence of the parameters on structural vibration reduction is discussed. The results demonstrate that semi-active control based on the Newmark-β predictive algorithm outperforms the classical full-state feedback control strategy and offers notable advantages in vibration reduction and control robustness.
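For reference, a minimal sketch of the underlying Newmark-β integration (average-acceleration variant, β = 1/4, γ = 1/2) is given below for a two-storey shear frame. The matrices, damping and ground motion are illustrative assumptions, and the MR damper control force search described in the abstract is omitted.

```python
# A minimal sketch of the Newmark-beta response prediction step for a
# two-storey shear frame. Properties and the ground-motion record are assumed;
# the semi-active MR control loop from the abstract is not included.
import numpy as np

def newmark_beta(M, C, K, p, dt, beta=0.25, gamma=0.5):
    """Integrate M u'' + C u' + K u = p(t); p has shape (nsteps, ndof)."""
    n, ndof = p.shape
    u = np.zeros((n, ndof)); v = np.zeros((n, ndof)); a = np.zeros((n, ndof))
    a[0] = np.linalg.solve(M, p[0] - C @ v[0] - K @ u[0])
    K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    for i in range(n - 1):
        p_eff = (p[i + 1]
                 + M @ (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                        + (1 / (2 * beta) - 1) * a[i])
                 + C @ (gamma / (beta * dt) * u[i] + (gamma / beta - 1) * v[i]
                        + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = np.linalg.solve(K_eff, p_eff)
        v[i + 1] = (gamma / (beta * dt) * (u[i + 1] - u[i])
                    + (1 - gamma / beta) * v[i]
                    + dt * (1 - gamma / (2 * beta)) * a[i])
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
    return u, v, a

# Two-storey frame (assumed properties) under a synthetic ground acceleration.
m, k, c = 1.0e4, 2.0e6, 1.0e4                  # kg, N/m, N*s/m (illustrative)
M = np.diag([m, m])
K = np.array([[2 * k, -k], [-k, k]])
C = np.array([[2 * c, -c], [-c, c]])
dt = 0.01
t = np.arange(0, 10, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)
p = -(M @ np.ones(2))[None, :] * ag[:, None]   # effective earthquake forces
u, v, a = newmark_beta(M, C, K, p, dt)
print(f"peak roof displacement: {np.abs(u[:, 1]).max():.4f} m")
```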
Abstract:
All relevant international standards for determining whether a metallic rod is flammable in oxygen utilize some form of “promoted ignition” test. In this test, for a given pressure, an overwhelming ignition source is coupled to the end of the test sample and the designation flammable or nonflammable is based upon the amount burned, that is, a burn criterion. It is documented that (1) the initial temperature of the test sample affects the burning of the test sample both (a) as regards the pressure at which the sample will support burning (threshold pressure) and (b) the rate at which the sample is melted (regression rate of the melting interface); and (2) the igniter used affects the test sample by heating it adjacent to the igniter as ignition occurs. Together, these facts make it necessary to ensure, if a metallic material is to be considered flammable at the conditions tested, that the burn criterion excludes any region of the test sample that may have undergone preheating during the ignition process. A two-dimensional theoretical model was developed to describe the transient heat transfer occurring, and the resultant temperatures produced, within this system. Several metals (copper, aluminum, iron, and stainless steel) and ignition promoters (magnesium, aluminum, and Pyrofuze®) were evaluated for a range of oxygen pressures between 0.69 MPa (100 psia) and 34.5 MPa (5,000 psia). A MATLAB® program was utilized to solve the developed model, which was validated against (1) a published solution for a similar system and (2) experimental data obtained during actual tests at the National Aeronautics and Space Administration White Sands Test Facility. The validated model successfully predicts temperatures within the test samples, with agreement between model and experiment increasing as test pressure increases and/or distance from the promoter increases. Oxygen pressure and test sample thermal diffusivity were shown to have the largest effect on the results. In all cases evaluated, there is no significant preheating (above about 38°C/100°F) at distances greater than 30 mm (1.18 in.) during the time the ignition source is attached to the test sample. This validates a distance of 30 mm (1.18 in.) above the ignition promoter as a burn length upon which a definition of flammable can be based for inclusion in relevant international standards (that is, burning past this length will always be independent of the ignition event for the ignition promoters considered here). KEYWORDS: promoted ignition, metal combustion, heat conduction, thin fin, promoted combustion, burn length, burn criterion, flammability, igniter effects, heat affected zone.
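The conduction mechanism at the heart of the model can be illustrated with a one-dimensional explicit finite-difference sketch of a rod heated at the promoter end. It is a loss-free simplification using an assumed stainless-steel-like diffusivity and invented boundary values, not the validated two-dimensional model; a loss-free sketch with a copper-like diffusivity would overpredict preheating because it ignores melting and surface losses.

```python
# A minimal 1D transient heat-conduction sketch of a rod heated at the
# promoter end, solved with explicit finite differences. All material and
# boundary values are illustrative assumptions, not model parameters.
import numpy as np

alpha = 4.0e-6            # m^2/s, assumed stainless-steel-like diffusivity
L, nx = 0.10, 201         # 100 mm rod, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha  # stable explicit step (<= 0.5 dx^2/alpha)

T = np.full(nx, 20.0)     # initial temperature, deg C
T_hot = 1100.0            # assumed promoter-end temperature during ignition
t_ignition = 5.0          # assumed duration the igniter is attached, s

for _ in range(int(t_ignition / dt)):
    T[0] = T_hot                       # heated end (Dirichlet)
    T[-1] = T[-2]                      # insulated far end (zero gradient)
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

x_mm = np.linspace(0, L * 1000, nx)
idx_30mm = np.argmin(np.abs(x_mm - 30.0))
print(f"temperature 30 mm from the promoter after {t_ignition:.0f} s: "
      f"{T[idx_30mm]:.1f} deg C")
```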
Abstract:
An earlier CRC-CI project on ‘automatic estimating’ (AE) showed that the key benefit of model-based design methodologies in building design and construction is the provision of timely quantitative cost evaluations. Furthermore, using AE during design improves design options and results in improved design turn-around times, better design quality and/or lower costs. However, AEs for civil engineering structures do not exist, and research partners in the CRC-CI expressed interest in exploring the development of such a process. This document reports on these investigations. The central objective of the study was to evaluate the benefits and costs of developing an AE for concrete civil engineering works. By studying existing documents and through interviews with design engineers, contractors and estimators, we established that current civil engineering practices (mainly roads/bridges) do not use model-based planning/design. Drawings are executed in 2D and only completed at the end of lengthy planning/design project management lifecycle stages. We also determined that estimating plays two important but different roles. The first is part of project management (which we have called macro-level estimating): estimating in this domain sets project budgets, controls quality delivery and contains costs. The second role is estimating during planning/design (micro-level estimating). The difference between the two roles is that the former is performed at the end of various lifecycle stages, whereas the latter is performed at any suitable time during planning/design.
Abstract:
Australia’s civil infrastructure assets of roads, bridges, railways, buildings and other structures are worth billions of dollars. Road assets alone are valued at around A$140 billion. As assets deteriorate over time, close to A$10 billion is spent annually on maintaining Australia's roads, the equivalent of A$27 million per day. To manage road infrastructure effectively, road agencies need, first, to optimise the expenditure for asset data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Second, road agencies need to accurately predict the deterioration rates of infrastructure to reflect local conditions so that budgets can be estimated accurately. Finally, the prediction of budgets for maintenance and rehabilitation must provide a certain degree of reliability. A procedure for assessing investment decisions for road asset management has been developed. The procedure includes:
• a methodology for optimising asset data collection;
• a methodology for calibrating deterioration prediction models;
• a methodology for assessing risk-adjusted estimates for life-cycle costs;
• a decision framework in the form of a risk map.
Abstract:
Under the Alien Tort Statute, the Federal Courts of the United States of America (“America”) have jurisdiction to hear claims for civil wrongs committed against non-American citizens and perpetrated outside America’s national borders. The operation of this law has confronted American Federal Courts with difficulties in managing conflicts between American executive foreign policy and judicial interpretations of international law. Courts began to pass judgment on conduct that had been approved by foreign governments. Then, in 2005, the American Supreme Court wound back the scope of the Alien Tort Statute. This article reviews the problems with the expansion of the Alien Tort Statute and the reasons for its subsequent narrowing.
Abstract:
Key topics: Since the birth of the open source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users additional rights such as the free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. Lastly, the literature identifies and depicts only two generic types of business models for open source software publishers: the “bundling” business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of “mutualisation”, which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (the “revenue model” (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
Literature addressing methodological issues in organisational research is extensive and multidisciplinary, encompassing debates about methodological choices, data-collection techniques, epistemological approaches and statistical procedures. However, little scholarship has tackled an important aspect of organisational research that precedes decisions about data collection and analysis – access to the organisations themselves, including the people, processes and documents within them. This chapter looks at organisational access through the experiences of three research fellows in the course of their work with their respective industry partners. In doing so, it reveals many of the challenges and changing opportunities associated with access to organisations, which are rarely explicitly addressed, but often assumed, in traditional methods texts and journal publications. Although the level of access granted varied somewhat across the projects at different points in time and according to different organisational contexts, we shared a number of core and consistent experiences in attempting to collect data and implement strategies.
Abstract:
Realistic estimates of short-term and long-term (strategic) budgets for the maintenance and rehabilitation of road assets should consider the stochastic characteristics of asset conditions across the road network, so that the overall variability of road asset condition data is taken into account. Probability theory has been used for assessing life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004) and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges, roads etc., and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap, this report will describe and summarise the methodologies presented by each publication and also suggest a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI), project no. 2003-029-C.
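To illustrate the stochastic approach advocated here, a minimal Monte Carlo sketch follows: uncertainty in deterioration rate and rehabilitation unit cost is propagated into a discounted life-cycle budget. All distributions, trigger levels and costs are invented for the example and are not drawn from the cited studies.

```python
# A minimal Monte Carlo life-cycle budgeting sketch. Distributions, the
# intervention trigger and costs are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
n_sims, horizon = 10_000, 30          # simulations, analysis period (years)
discount = 0.05                       # assumed discount rate

totals = np.zeros(n_sims)
for s in range(n_sims):
    condition = 100.0                                  # start as-new
    rate = rng.normal(2.5, 0.6)                        # condition loss per year
    unit_cost = rng.lognormal(np.log(1.2e6), 0.25)     # A$ per rehabilitation
    for year in range(1, horizon + 1):
        condition -= max(rate, 0.0)
        if condition < 60.0:                           # intervention trigger
            totals[s] += unit_cost / (1 + discount) ** year
            condition = 100.0                          # rehabilitated

print(f"mean NPV of rehabilitation budget: A${totals.mean():,.0f}")
print(f"90th percentile (risk-adjusted):   A${np.percentile(totals, 90):,.0f}")
```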
Abstract:
In this paper, the stability of an autonomous microgrid with multiple distributed generators (DGs) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through Voltage Source Converters (VSCs) and that all connected loads are passive. The VSCs are controlled by state feedback controllers to achieve the desired voltage and current outputs, which are set by a droop controller. The state space model of each converter with its associated feedback is derived. These models are then combined with the state space models of the droop controller, network and loads to form a homogeneous model, from which the eigenvalues are evaluated. The system stability is then investigated as a function of the real and reactive power coefficients of the droop controller. These observations are verified through simulation studies using PSCAD/EMTDC. It is shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
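The flavour of such an eigenvalue sweep can be shown with a heavily reduced small-signal model: a single droop-controlled VSC against a stiff bus, with a power measurement filter and an actuation lag, swept over the real power droop gain. The model structure and every parameter value below are illustrative assumptions, not the paper's full converter/network/load state-space model.

```python
# Eigenvalue sweep over the real-power droop gain for a reduced small-signal
# model of one droop-controlled VSC on a stiff bus. K is a per-unit
# power-angle stiffness, w_c the power-filter cutoff, tau an actuation lag;
# all values are assumed for illustration.
import numpy as np

K, w_c, tau = 10.0, 31.4, 0.05   # assumed per-unit stiffness, filter, lag

def state_matrix(m_p):
    """States: [delta, d(delta)/dt, filtered power]."""
    return np.array([
        [0.0,        1.0,        0.0],
        [0.0,  -1.0 / tau, -m_p / tau],
        [w_c * K,    0.0,       -w_c],
    ])

# In this reduced model the system destabilizes once m_p*K > 1/tau + w_c.
for m_p in [0.5, 1.0, 2.0, 4.0, 6.0, 8.0]:
    eigs = np.linalg.eigvals(state_matrix(m_p))
    stable = np.all(eigs.real < 0)
    print(f"m_p = {m_p:4.1f}  max Re(lambda) = {eigs.real.max():8.3f}  "
          f"{'stable' if stable else 'unstable'}")
```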
Abstract:
This paper describes the operation of a microgrid that contains a custom power park (CPP). The park may contain an unbalanced and/or nonlinear load, and the microgrid may contain many distributed generators (DGs). One of the DGs in the microgrid is used as a compensator to achieve load compensation. A new method is proposed for generating the current references for load compensation, which takes into account the real and reactive power to be supplied by the DG connected to the compensator. The real and reactive power from the DGs and the utility source is tightly regulated, assuming that dedicated communication channels are available. This scheme is therefore most suitable where the loads in the CPP and the DGs are physically located close to each other. The proposal is validated through extensive simulation studies using the EMTDC/PSCAD software package (version 4.2).
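A single-phase sketch of the current-reference idea is given below: the compensating DG injects the difference between the load current and a sinusoidal in-phase source current carrying only the average real power (a Fryze-style reference). The paper's method is three-phase and additionally dispatches DG real and reactive power; the waveforms and values here are assumptions for illustration.

```python
# Single-phase Fryze-style current-reference sketch for load compensation.
# Waveforms and ratings are illustrative assumptions only.
import numpy as np

f, fs = 50.0, 10_000.0
t = np.arange(0, 0.1, 1 / fs)                        # five mains cycles
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f * t)     # source voltage

# Nonlinear/reactive load current: lagging fundamental plus a 3rd harmonic.
i_load = (20 * np.sin(2 * np.pi * f * t - np.pi / 6)
          + 6 * np.sin(3 * 2 * np.pi * f * t))

P_avg = np.mean(v * i_load)                  # average real power, W
V_rms_sq = np.mean(v**2)
i_source_ref = (P_avg / V_rms_sq) * v        # sinusoidal, in phase with v
i_comp_ref = i_load - i_source_ref           # injected by the compensator DG

print(f"average load power: {P_avg:,.0f} W")
print(f"peak source current reference: {np.abs(i_source_ref).max():.1f} A")
print(f"peak compensator current:      {np.abs(i_comp_ref).max():.1f} A")
```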