908 results for Belief-Based Targets


Relevance:

20.00%

Publisher:

Abstract:

Based on the Newmark-β method, the structural vibration response is predicted. By searching for suitable control force parameters within prescribed ranges to optimise an objective function, predictive control of the structural vibration is achieved. A numerical simulation of a two-storey frame structure fitted with magneto-rheological (MR) dampers under earthquake records is also carried out, and the influence of the parameters on vibration reduction is discussed. The results demonstrate that semi-active control based on the Newmark-β predictive algorithm outperforms the classical full-state feedback control strategy, with notable advantages in structural vibration reduction and control robustness.
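As a point of reference for the prediction step mentioned above, the following is a minimal sketch of a single Newmark-β time step for a linear multi-degree-of-freedom structure M·ü + C·u̇ + K·u = f(t). It is an illustrative assumption, not the authors' implementation; the matrices, step size and the two-storey example values are placeholders.

```python
# Hedged sketch, not the authors' code: one Newmark-beta step for a linear
# multi-degree-of-freedom system  M*a + C*v + K*u = f(t).  The default
# parameters beta = 1/4, gamma = 1/2 give the average-acceleration scheme.
import numpy as np

def newmark_beta_step(M, C, K, u, v, a, f_next, dt, beta=0.25, gamma=0.5):
    """Advance displacement u, velocity v and acceleration a by one step dt."""
    # Effective stiffness and effective load for the implicit update.
    K_eff = K + gamma / (beta * dt) * C + 1.0 / (beta * dt**2) * M
    f_eff = (f_next
             + M @ (u / (beta * dt**2) + v / (beta * dt) + (1.0 / (2 * beta) - 1.0) * a)
             + C @ (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2 * beta) - 1.0) * a))
    u_next = np.linalg.solve(K_eff, f_eff)
    a_next = ((u_next - u) / (beta * dt**2) - v / (beta * dt)
              - (1.0 / (2 * beta) - 1.0) * a)
    v_next = v + dt * ((1.0 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

# Example: a hypothetical two-storey shear frame in free vibration
# from an initial displacement (all values assumed).
M = np.diag([2.0e4, 2.0e4])                       # kg
K = np.array([[4.0e7, -2.0e7], [-2.0e7, 2.0e7]])  # N/m
C = 0.02 * K                                      # stiffness-proportional damping
u = np.array([0.01, 0.02]); v = np.zeros(2)
a = np.linalg.solve(M, -C @ v - K @ u)            # consistent initial acceleration
for _ in range(5):
    u, v, a = newmark_beta_step(M, C, K, u, v, a, np.zeros(2), dt=0.01)
print(u)
```

In a predictive scheme of the kind described, this step would be evaluated repeatedly for candidate MR damper control forces (added to f_next), and the candidate that minimises the objective function would be applied.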

Relevance:

20.00%

Publisher:

Abstract:

An earlier CRC-CI project on ‘automatic estimating’ (AE) has shown the key benefit of model-based design methodologies in building design and construction to be the provision of timely quantitative cost evaluations. Furthermore, using AE during design improves design options and results in improved design turn-around times, better design quality and/or lower costs. However, AEs for civil engineering structures do not yet exist, and research partners in the CRC-CI expressed interest in exploring the development of such a process. This document reports on those investigations. The central objective of the study was to evaluate the benefits and costs of developing an AE for concrete civil engineering works. By studying existing documents and through interviews with design engineers, contractors and estimators, we have established that current civil engineering practice (mainly roads and bridges) does not use model-based planning/design. Drawings are executed in 2D and are only completed at the end of lengthy planning/design project management lifecycle stages. We have also determined that estimating plays two important but different roles. The first is part of project management (which we have called macro-level estimating); estimating in this domain sets project budgets, controls quality delivery and contains costs. The second role is estimating during planning/design (micro-level estimating). The difference between the two is that the former is performed at the end of various lifecycle stages, whereas the latter is performed at any suitable time during planning/design.

Relevance:

20.00%

Publisher:

Abstract:

Australia’s civil infrastructure assets of roads, bridges, railways, buildings and other structures are worth billions of dollars. Road assets alone are valued at around A$140 billion. As the condition of these assets deteriorates over time, close to A$10 billion is spent annually on asset maintenance on Australia's roads, the equivalent of A$27 million per day. To manage road infrastructure effectively, road agencies first need to optimise expenditure on asset data collection without jeopardising the reliability of using the reduced data to predict maintenance and rehabilitation costs. Secondly, they need to predict deterioration rates that reflect local conditions so that budgets can be estimated accurately. Finally, predictions of maintenance and rehabilitation budgets must carry a certain degree of reliability. A procedure for assessing investment decisions for road asset management has been developed. The procedure includes:
• a methodology for optimising asset data collection;
• a methodology for calibrating deterioration prediction models;
• a methodology for assessing risk-adjusted life-cycle cost estimates; and
• a decision framework in the form of a risk map.
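A hedged sketch of what a risk-adjusted life-cycle cost estimate of this kind can look like is given below. The cost components, probability distributions and the use of a 90th-percentile figure as the risk-adjusted estimate are assumptions for illustration, not the procedure developed in the project.

```python
# Illustrative Monte Carlo sketch of a "risk-adjusted" life-cycle cost estimate.
# All distributions and unit costs are assumed values, not project data.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 20_000
years = 30
discount_rate = 0.06

# Assumed uncertain inputs (per km of road): routine maintenance and periodic rehabilitation.
annual_maintenance = rng.normal(loc=25_000, scale=5_000, size=(n_sim, years))
rehab_interval = rng.integers(low=8, high=13, size=n_sim)                 # years between rehabs
rehab_cost = rng.lognormal(mean=np.log(450_000), sigma=0.25, size=n_sim)  # $ per rehab

discount = 1.0 / (1.0 + discount_rate) ** np.arange(1, years + 1)
lcc = (annual_maintenance * discount).sum(axis=1)
for i in range(n_sim):
    rehab_years = np.arange(rehab_interval[i], years + 1, rehab_interval[i])
    lcc[i] += (rehab_cost[i] / (1.0 + discount_rate) ** rehab_years).sum()

print(f"Expected LCC per km : ${lcc.mean():,.0f}")
print(f"P90 (risk-adjusted) : ${np.quantile(lcc, 0.90):,.0f}")
```

Reporting a high percentile alongside the mean is one common way of expressing the reliability of a budget estimate; in a procedure like the one described, the input distributions would be calibrated from the collected asset condition data and the deterioration models.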

Relevance:

20.00%

Publisher:

Abstract:

Under the Alien Tort Statute, United States of America (“America”) federal courts have jurisdiction to hear claims for civil wrongs committed against non-American citizens and perpetrated outside America’s national borders. The operation of this law has confronted American federal courts with difficulties in managing conflicts between American executive foreign policy and judicial interpretations of international law, as courts began to pass judgment on conduct that had been approved by foreign governments. Then, in 2005, the American Supreme Court wound back the scope of the Alien Tort Statute. This article reviews the problems with the expansion of the Alien Tort Statute and the reasons for its subsequent narrowing.

Relevance:

20.00%

Publisher:

Abstract:

Key topics: Since the birth of the open source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature and motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, and they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software themselves. This paper focuses on the business models of open source software publishers, as the issue is even more crucial for this category of firms, which takes the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the 'bundling' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of 'mutualisation', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. Here, a business model is not only considered as a way of generating income (a 'revenue model' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.

Relevance:

20.00%

Publisher:

Abstract:

Principal Topic: The Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) represents the first Australian study to employ and extend the longitudinal, large-scale systematic research approach developed for the Panel Study of Entrepreneurial Dynamics (PSED) in the US (Gartner, Shaver, Carter and Reynolds, 2004; Reynolds, 2007). This research approach addresses several shortcomings of other data sets, including under-coverage, selection bias, memory decay and hindsight bias, and lack of time separation between the assessment of causes and their assumed effects (Johnson et al. 2006; Davidsson 2006). However, a remaining problem is that any random sample of start-ups will be dominated by low-potential, imitative ventures. In recognition of this issue, CAUSEE supplemented the PSED-type random samples with theoretically representative samples of 'high-potential' emerging ventures, identified through a unique methodology using multiple novel screening criteria. We define 'high-potential' ventures as new, innovative entrepreneurial ventures with high aspirations and potential for growth. This distinguishes them from 'lifestyle' imitative businesses that start small and remain intentionally small (Timmons, 1986). CAUSEE provides the opportunity to explore, for the first time, whether the processes and outcomes of high potentials differ from those of traditional lifestyle firms. This allows us to compare process and outcome attributes of the random sample with those of the high-potential oversample of new firms and young firms. The attributes in which we will examine potential differences include sources of funding and internationalisation. This is interesting both for helping to explain why different outcomes occur and for assisting future policymaking, given that high-growth-potential firms are increasingly becoming the focus of government intervention in economic development policies around the world. The first wave of data of a four-year longitudinal study has been collected using these samples, allowing us to provide some initial analysis on which to base further research. The aim of this paper is therefore to present selected preliminary results from the first wave of data collection, comparing high-potential with lifestyle firms. Owing to greater resource requirements and higher risk profiles, we expect to see more use of venture capital and angel investment, and more internationalisation activity to assist in recouping investment and to overcome Australia's smaller economic market. Methodology/Key Propositions: In order to develop the samples of high potentials in the nascent firm (NF) and young firm (YF) categories, a set of qualification criteria was developed. Specifically, to qualify firms as nascent or young high potentials, we used multiple, partly compensating screening criteria related to the human capital and aspirations of the founders, the novelty of the venture idea, and the venture's use of high technology. A variety of techniques was also employed to develop a multi-level dataset of sources for generating leads and firm details.
A dataset was generated from a variety of websites covering major stakeholders, including the Federal and State Governments, the Australian Chamber of Commerce, university commercialisation offices, patent and trademark attorneys, government and industry awards in entrepreneurship and innovation, industry lead associations, the Venture Capital Association, innovation directories including the Australian Technology Showcase, and business and entrepreneurship magazines including BRW and Anthill. In total, over 480 industry, association, government and award sources were generated in this process. Of these, 74 discrete sources generated high potentials that fulfilled the criteria. 1,116 firms were contacted as high-potential cases; 331 agreed to participate in the screener, with 279 firms (134 nascent and 140 young firms) passing the high-potential criteria. 222 firms (108 nascent and 113 young firms) completed the full interview. For the general sample, CAUSEE conducts screening phone interviews with a very large number of adult members of households, randomly selected through random digit dialling, using screening questions that determine whether respondents qualify as 'nascent entrepreneurs'. CAUSEE additionally targets 'young firms': those that commenced trading in 2004 or later. This process yielded 977 nascent firms (3.4%) and 1,011 young firms (3.6%), which were directed to the full-length interview (40-60 minutes) either directly following the screener or later by appointment. The full-length interviews were completed by 594 NF and 514 YF cases. These are the cases we use in the comparative analysis in this report. Results and Implications: The results for this paper are based on wave one of the survey, which has been completed and the data obtained. It is expected that the findings will begin to develop an understanding of high-potential nascent and young firms in Australia, how they differ from the larger lifestyle entrepreneur group that makes up the vast majority of new firms created each year, and the elements that may contribute to turning high growth potential into high-growth realities. The results have implications for government, in designing better conditions for the creation of new businesses; for firms that assist high potentials, in developing advice programs that better reflect their needs and requirements; for individuals who may be considering becoming entrepreneurs in high-potential arenas; and for existing entrepreneurs seeking to make better decisions.

Relevance:

20.00%

Publisher:

Abstract:

Literature addressing methodological issues in organisational research is extensive and multidisciplinary, encompassing debates about methodological choices, data-collection techniques, epistemological approaches and statistical procedures. However, little scholarship has tackled an important aspect of organisational research that precedes decisions about data collection and analysis – access to the organisations themselves, including the people, processes and documents within them. This chapter looks at organisational access through the experiences of three research fellows in the course of their work with their respective industry partners. In doing so, it reveals many of the challenges and changing opportunities associated with access to organisations, which are rarely explicitly addressed, but often assumed, in traditional methods texts and journal publications. Although the level of access granted varied somewhat across the projects at different points in time and according to different organisational contexts, we shared a number of core and consistent experiences in attempting to collect data and implement strategies.

Relevance:

20.00%

Publisher:

Abstract:

Realistic estimates of short- and long-term (strategic) budgets for maintenance and rehabilitation in road asset management should consider the stochastic characteristics of asset conditions across the road network, so that the overall variability of road asset condition data is taken into account. Probability theory has been used to assess life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004) and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these assets. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data to assess budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented by each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI), project no. 2003-029-C.
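To make the idea concrete, here is a hedged sketch (not the report's methodology) of how stochastic condition data can feed a probabilistic budget: a simple Markov-chain deterioration model propagates a distribution of condition states forward in time, and an expected treatment budget is read off each year. The transition probabilities, condition states and unit treatment costs are assumed values.

```python
# Illustrative Markov-chain deterioration model: a road segment moves through
# condition states 1 (good) to 4 (poor). All numbers are assumptions used only
# to show how a stochastic condition forecast can drive a budget estimate.
import numpy as np

# Annual transition probabilities between condition states (rows sum to 1).
P = np.array([
    [0.85, 0.12, 0.03, 0.00],   # from state 1 (good)
    [0.00, 0.80, 0.15, 0.05],   # from state 2
    [0.00, 0.00, 0.75, 0.25],   # from state 3
    [0.00, 0.00, 0.00, 1.00],   # state 4 (poor) stays poor until rehabilitated
])
treatment_cost = np.array([0.0, 5_000.0, 40_000.0, 250_000.0])  # $ per km in each state

state = np.array([1.0, 0.0, 0.0, 0.0])   # the network starts in good condition
network_km = 120.0

for year in range(1, 11):
    state = state @ P                                   # propagate the condition distribution
    expected_budget = network_km * state @ treatment_cost
    print(f"year {year:2d}: expected budget ${expected_budget:,.0f}")
```

Calibrating a transition matrix of this kind against observed condition data is the sort of deterioration-model calibration the reviewed literature addresses.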

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the stability of an autonomous microgrid with multiple distributed generators (DGs) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through voltage source converters (VSCs) and that all connected loads are passive. The VSCs are controlled by state feedback controllers to achieve the desired voltage and current outputs, which are decided by a droop controller. The state space models of each converter with its associated feedback are derived. These are then combined with the state space models of the droop controller, network and loads to form a homogeneous model, through which the eigenvalues are evaluated. The system stability is then investigated as a function of the real and reactive power coefficients of the droop controller. These observations are verified through simulation studies using PSCAD/EMTDC. It is shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
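As a much-reduced illustration of the eigenvalue sweep described above (the paper's full model also includes the converter, feedback, network and load dynamics), the sketch below keeps only the real-power/frequency droop dynamics of two inverter-interfaced DGs coupled through an inductive line and evaluates the eigenvalues for several droop gains. All numerical values are assumptions.

```python
# Hedged, much-simplified sketch: only the P-f droop dynamics are kept.
# State x = [delta, P1, P2], where delta is the angle difference between the
# two units and P1, P2 are their low-pass-filtered power measurements.
import numpy as np

V1 = V2 = 1.0            # per-unit voltage magnitudes
X = 0.25                 # per-unit line reactance
K = V1 * V2 / X          # linearised power-transfer coefficient
wc = 2 * np.pi * 5.0     # power-measurement filter cut-off (rad/s)
m2 = 0.01                # fixed P-f droop gain of DG2

def A_matrix(m1):
    """Small-signal state matrix for a given droop gain m1 of DG1."""
    return np.array([
        [0.0,     -m1,   m2 ],
        [ wc * K, -wc,   0.0],
        [-wc * K,  0.0,  -wc],
    ])

for m1 in (0.005, 0.01, 0.05, 0.2):
    eigs = np.linalg.eigvals(A_matrix(m1))
    print(f"m1 = {m1:5.3f}  eigenvalues = {np.round(eigs, 2)}  "
          f"stable = {bool(np.all(eigs.real < 0))}")
```

In this reduced model the dominant eigenvalue simply shifts with the droop gain; the detailed converter and network dynamics retained in the paper's model are what capture the interactions that can destabilise the system for large droop coefficients.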

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the operation of a microgrid that contains a custom power park (CPP). The park may contain unbalanced and/or nonlinear loads, and the microgrid may contain many distributed generators (DGs). One of the DGs in the microgrid is used as a compensator to achieve load compensation. A new method is proposed for current reference generation for load compensation, which takes into account the real and reactive power to be supplied by the DG connected to the compensator. The real and reactive power from the DGs and the utility source are tightly regulated, assuming that dedicated communication channels are available. This scheme is therefore most suitable where the loads in the CPP and the DGs are physically located close to one another. The proposal is validated through extensive simulation studies using the EMTDC/PSCAD software package (version 4.2).
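The paper's exact reference-generation algorithm is not reproduced here; the sketch below only illustrates the underlying idea at the fundamental-frequency phasor level: the utility source is asked for a balanced, unity-power-factor current covering the load power not assigned to the compensating DG, and the DG's current reference is whatever remains of the (unbalanced, reactive) load current. The voltages, load currents and the DG power share are assumed values.

```python
# Hedged phasor-level illustration of compensator current-reference generation.
import numpy as np

V = 230.0                                    # phase voltage magnitude (V rms)
a = np.exp(2j * np.pi / 3)
v = V * np.array([1, a**-1, a])              # balanced three-phase voltage phasors (a-b-c)

# An assumed unbalanced, lagging load current (phasors, A rms).
i_load = np.array([18 * np.exp(-1j * 0.5),
                   9 * np.exp(-1j * 0.9),
                   14 * np.exp(-1j * 0.2)])

p_load = np.sum(np.real(v * np.conj(i_load)))   # total real power drawn by the load
p_dg = 4000.0                                   # real power assigned to the compensating DG

# Source reference: balanced, in phase with the voltage, supplying p_load - p_dg.
i_src_mag = (p_load - p_dg) / (3 * V)
i_src_ref = i_src_mag * v / V                   # same angles as the voltages, unity power factor

# Compensator (DG) reference: whatever is left of the load current.
i_comp_ref = i_load - i_src_ref
print("Load power (W):", round(p_load))
print("Source reference currents (A):", np.round(i_src_ref, 2))
print("Compensator reference currents (A):", np.round(i_comp_ref, 2))
```

A time-domain scheme would apply the same decomposition sample by sample; that level of detail is beyond this illustration.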

Relevance:

20.00%

Publisher:

Abstract:

Designing and estimating civil concrete structures is a complex process which, in the view of many practitioners, is tied to manual or semi-manual 2D design processes and cannot be further improved by automated, interacting design-estimating processes. This paper presents a feasibility study for the development of an automated estimator for concrete bridge design. The study offers a value proposition: an efficient automated model-based estimator can add value to the whole bridge design-estimating process by reducing estimation errors, shortening the time taken to produce successive estimates, and increasing the benefit of doing cost estimation compared with current practice. This is followed by a description of what an efficient automated model-based estimator consists of and how it should be used. Finally, the process of model-based estimating is compared with current practice to highlight the value embedded in the automated processes.
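As a toy illustration of the value proposition (and not the estimator studied in the paper), the sketch below shows how a model-based estimator can derive quantities and costs directly from design parameters, so that a design change immediately flows through to a revised estimate. The deck geometry, unit rates and reinforcement density are assumptions.

```python
# Speculative sketch: quantities and costs derived directly from design parameters.
from dataclasses import dataclass

@dataclass
class DeckDesign:
    span_m: float
    width_m: float
    thickness_m: float
    reinforcement_kg_per_m3: float = 120.0   # assumed reinforcement density

UNIT_RATES = {               # assumed unit rates
    "concrete_m3": 350.0,      # $/m3 supplied and placed
    "reinforcement_kg": 2.8,   # $/kg supplied, cut, bent and fixed
    "formwork_m2": 95.0,       # $/m2 to the deck soffit
}

def estimate(deck: DeckDesign) -> dict:
    """Quantity take-off and costing from the parametric deck model."""
    concrete = deck.span_m * deck.width_m * deck.thickness_m
    reinforcement = concrete * deck.reinforcement_kg_per_m3
    formwork = deck.span_m * deck.width_m
    return {
        "concrete_m3": concrete * UNIT_RATES["concrete_m3"],
        "reinforcement_kg": reinforcement * UNIT_RATES["reinforcement_kg"],
        "formwork_m2": formwork * UNIT_RATES["formwork_m2"],
    }

base = DeckDesign(span_m=32.0, width_m=11.5, thickness_m=0.45)
revised = DeckDesign(span_m=32.0, width_m=11.5, thickness_m=0.50)   # a design change
for name, design in (("base", base), ("revised", revised)):
    costs = estimate(design)
    breakdown = ", ".join(f"{k}: ${v:,.0f}" for k, v in costs.items())
    print(f"{name}: total ${sum(costs.values()):,.0f}  ({breakdown})")
```

Here the "model" is only a small dataclass; in a full model-based workflow the quantities would be extracted from the bridge design model itself rather than from hand-set parameters.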