Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used in describing the services, as well as the input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Service Description Language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find hidden meanings of the query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum traversal path at minimum cost. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
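The all-pairs shortest-path step of the link-analysis phase can be sketched as below: a generic Floyd-Warshall implementation over a service-composition graph. The service names and edge costs are invented for illustration; this is not the thesis's actual code.

```python
# Web services as graph nodes, composition-link costs as edge weights;
# Floyd-Warshall yields the cheapest path between every pair of services.

INF = float("inf")

def floyd_warshall(services, edges):
    """services: list of names; edges: {(a, b): cost} for linkable pairs.
    Returns index map, distance matrix, and next-hop table."""
    idx = {s: i for i, s in enumerate(services)}
    n = len(services)
    dist = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    nxt = [[None] * n for _ in range(n)]
    for (a, b), cost in edges.items():
        dist[idx[a]][idx[b]] = cost
        nxt[idx[a]][idx[b]] = idx[b]
    for k in range(n):                      # allow k as an intermediate node
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]
    return idx, dist, nxt

def best_composition(services, edges, src, dst):
    """Reconstruct the cheapest composition path from src to dst."""
    idx, dist, nxt = floyd_warshall(services, edges)
    i, j = idx[src], idx[dst]
    if i != j and nxt[i][j] is None:
        return None, INF                    # no feasible composition
    path, cur = [src], i
    while cur != j:
        cur = nxt[cur][j]
        path.append(services[cur])
    return path, dist[i][j]

services = ["GeoCode", "Weather", "Forecast"]
edges = {("GeoCode", "Weather"): 2.0, ("Weather", "Forecast"): 1.0,
         ("GeoCode", "Forecast"): 5.0}
path, cost = best_composition(services, edges, "GeoCode", "Forecast")
# path == ["GeoCode", "Weather", "Forecast"], cost == 3.0
```

The indirect route through "Weather" (cost 3.0) beats the direct edge (cost 5.0), which is exactly the kind of cheapest-traversal result the link-analysis phase feeds to the fusion engine.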
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practised in the building industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally these mainly two-dimensional representations, such as plans, sections, elevations and schedules, were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer Aided Design and Drafting (CADD), or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the construction industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and then in the real world. There is, however, no established model for learning through the use of this technology in architecture courses.
Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as industry practices evolve. As such, during the second half of 2008, QUT 4th-year architectural students were formally introduced for the first time to BIM, both as a technology and as an industry practice. This paper will outline the teaching team's experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provide a description of the learning model. The paper will present the results of a survey on the learners' perspectives of both BIM and their learning experiences as they learn about and through this technology.
Abstract:
Recent decisions of the Family Court of Australia reflect concerns over the adversarial nature of the legal process. The processes and procedures of the judicial system militate against a detailed examination of the issues and rights of the parties in dispute. The limitations of the family law framework are particularly demonstrated in disputes over the custody of children, where the Court has tended to neglect the rights and interests of the primary carer. An alternative "unified family court" framework will be examined, in which the Court pursues a more active and interventionist approach in the determination of family law disputes.
Abstract:
It has previously been found that complexes comprised of vitronectin and growth factors (VN:GF) enhance keratinocyte protein synthesis and migration. More specifically, these complexes have been shown to significantly enhance the migration of dermal keratinocytes derived from human skin. In view of this, it was thought that these complexes might hold potential as a novel therapy for healing chronic wounds. However, there was no evidence indicating that the VN:GF complexes would retain their effect on keratinocytes in the presence of chronic wound fluid. The studies in this thesis demonstrate for the first time that the VN:GF complexes not only stimulate proliferation and migration of keratinocytes, but also that these effects are maintained in the presence of chronic wound fluid in a 2-dimensional (2-D) cell culture model. Whilst the 2-D culture system provided insights into how the cells might respond to the VN:GF complexes, this investigative approach is not ideal, as skin is a 3-dimensional (3-D) tissue. Accordingly, a 3-D human skin equivalent (HSE) model, which reflects more closely the in vivo environment, was used to test the VN:GF complexes on epidermopoiesis. These studies revealed that the VN:GF complexes enable keratinocytes to migrate, proliferate and differentiate on a de-epidermalised dermis (DED), ultimately forming a fully stratified epidermis. In addition, fibroblasts were seeded on DED and shown to migrate into the DED in the presence of the VN:GF complexes and hyaluronic acid, another important biological factor in the wound healing cascade. This HSE model was then further developed to enable studies examining the potential of the VN:GF complexes in epidermal wound healing. Specifically, a reproducible partial-thickness HSE wound model was created in fully-defined media and monitored as it healed. In this situation, the VN:GF complexes were shown to significantly enhance keratinocyte migration and proliferation, as well as differentiation.
This model was also subsequently utilised to assess the wound healing potential of a synthetic fibrin-like gel that had previously been demonstrated to bind growth factors. Of note, keratinocyte re-epithelialisation was shown to be markedly improved in the presence of this 3-D matrix, highlighting its future potential for use as a delivery vehicle for the VN:GF complexes. Furthermore, when this synthetic fibrin-like gel was injected into a 4 mm diameter full-thickness wound created in the HSE, both keratinocytes and fibroblasts were shown to migrate into the gel, as revealed by immunofluorescence. Interestingly, keratinocyte migration into this matrix was found to be dependent upon the presence of the fibroblasts. Taken together, these data indicate that reproducible wounds, as created in the HSEs, provide a relevant ex vivo tool for assessing potential wound healing therapies. Moreover, the models will decrease our reliance on animals for scientific experimentation. Additionally, it is clear that these models will significantly assist in the development of novel treatments, such as the VN:GF complexes and the synthetic fibrin-like gel described herein, ultimately facilitating their clinical trial in the treatment of chronic wounds.
Abstract:
Chronic wounds are a significant socioeconomic problem for governments worldwide. Approximately 15% of people who suffer from diabetes will experience a lower-limb ulcer at some stage of their lives, and 24% of these wounds will ultimately result in amputation of the lower limb. Hyperbaric Oxygen Therapy (HBOT) has been shown to aid the healing of chronic wounds; however, the causal reasons for the improved healing remain unclear and hence current HBOT protocols remain empirical. Here we develop a three-species mathematical model of wound healing that is used to simulate the application of hyperbaric oxygen therapy in the treatment of wounds. Based on our modelling, we predict that intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds. Furthermore, treatment should continue until healing is complete, and HBOT will not stimulate healing under all circumstances, leading us to conclude that finding the right protocol for an individual patient is crucial if HBOT is to be effective. We provide constraints, which depend on the model parameters, on the range of HBOT protocols that will stimulate healing. More specifically, we predict that patients with a poor arterial supply of oxygen, high consumption of oxygen by the wound tissue, chronically hypoxic wounds, and/or a dysfunctional endothelial cell response to oxygen are at risk of non-responsiveness to HBOT. The work of this paper can help to highlight which patients are most likely to respond well to HBOT (for example, those with a good arterial supply), and thus has the potential to assist in improving both the success rate and hence the cost-effectiveness of this therapy.
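As a rough illustration of how such a simulation might be set up, the sketch below integrates a hypothetical three-species system (oxygen, capillaries, fibroblasts) with intermittent HBOT entering as a time-dependent oxygen source. Every equation, coefficient and protocol value here is invented for illustration; this is not the model or parameterisation of the paper.

```python
# Hypothetical three-species sketch: oxygen c, capillary density v,
# fibroblast density f, advanced with a simple explicit Euler scheme.

def hbot_on(t, session=1.5, interval=24.0):
    """Intermittent protocol: a `session`-hour treatment every `interval` hours."""
    return (t % interval) < session

def step(state, t, dt, hbot=True):
    c, v, f = state
    c_supply = 4.0 if (hbot and hbot_on(t)) else 1.0   # arterial/HBOT oxygen level
    dc = c_supply * v - 0.5 * c * f - 0.1 * c          # delivery - consumption - decay
    dv = 0.2 * f * c * (1.0 - v) - 0.05 * v            # oxygen-dependent angiogenesis
    df = 0.3 * f * (1.0 - f) * c - 0.1 * f             # oxygen-limited proliferation
    return (c + dt * dc, v + dt * dv, f + dt * df)

def simulate(hbot, hours=24.0 * 10, dt=0.01):
    """Integrate from a wounded (hypoxic, sparse) initial state for `hours`."""
    state, t = (0.1, 0.1, 0.1), 0.0
    while t < hours:
        state = step(state, t, dt, hbot)
        t += dt
    return state

healed = simulate(hbot=True)   # final (oxygen, capillary, fibroblast) state
```

Comparing runs with and without the intermittent source is the kind of in-silico experiment that lets one probe which protocols (session length, interval, supply level) move the system toward the healed state.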
Abstract:
The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. The research proposes a solution to address the difficulties associated with multi-step attack scenario specification and detection for such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection. The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the event abstraction model presented in this work preserves all low-level information as well as providing high-level information in the form of abstract events. The event abstraction model presented in this work was designed independently of any particular IDS and thus may be used by any IDS, intrusion forensic tool, or monitoring tool. The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect as they often involve the correlation of events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism by using variable instantiation, where variables represent events as defined in the event abstraction model. The third part of the research investigates a solution to time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts.
Issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty issues: clock skew compensation and clock drift modelling using linear regression. An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs and a synthetic dataset. The distinct feature of the public dataset is that it contains multi-step attacks involving multiple hosts with clock skew and clock drift. These features allow us to demonstrate the application and the advantages of the contributions of this research. All instances of multi-step attacks in the dataset have been correctly identified even though significant clock skew and drift exist in the dataset. Future work identified by this research is to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work is to develop mechanisms which allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS which processes events from heterogeneous sources using abstraction and which can detect multi-step attack scenarios that may involve time uncertainty.
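The clock-drift-modelling idea can be sketched with ordinary least squares: fit a line relating a remote host's timestamps to a reference clock, then invert it to place remote log events on the reference timeline. The timestamps below are invented, and this is a generic sketch rather than the prototype's code.

```python
# Model a remote clock as: remote = offset + rate * reference, where
# rate = 1 + drift. Fit offset and rate by ordinary least squares.

def fit_drift(ref_times, remote_times):
    """Least-squares fit of remote = offset + rate * ref. Returns (offset, rate)."""
    n = len(ref_times)
    mx = sum(ref_times) / n
    my = sum(remote_times) / n
    sxx = sum((x - mx) ** 2 for x in ref_times)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ref_times, remote_times))
    rate = sxy / sxx
    offset = my - rate * mx
    return offset, rate

def to_reference(remote_t, offset, rate):
    """Invert the model: recover the reference-clock time of a remote event."""
    return (remote_t - offset) / rate

# Example: remote clock is 120 s ahead (skew) and runs 0.1% fast (drift).
ref = [0.0, 1000.0, 2000.0, 3000.0]
remote = [120.0 + 1.001 * t for t in ref]
offset, rate = fit_drift(ref, remote)
# to_reference(remote[2], offset, rate) recovers ~2000.0 on the reference clock
```

Mapping every host's events onto one reference timeline in this way is what makes cross-host event ordering, and hence multi-step scenario matching, meaningful despite skew and drift.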
Abstract:
The weaknesses of 'traditional' modes of instruction in accounting education have been widely discussed. Many contend that the traditional approach limits the ability to provide opportunities for students to raise their competency level and to apply knowledge and skills in professional problem-solving situations. However, the recent body of literature suggests that accounting educators are indeed actively experimenting with 'non-traditional' and 'innovative' instructional approaches, with some authors clearly favouring one approach over another. But can one instructional approach alone meet the necessary conditions for different learning objectives? Taking into account the ever-changing landscape of not only business environments but also the higher education sector, the premise guiding the collaborators in this research is that it is perhaps counterproductive to promote competing dichotomous views of 'traditional' and 'non-traditional' instructional approaches to accounting education, and that the notion of 'blended learning' might provide a useful framework to enhance the learning and teaching of accounting. This paper reports on the first cycle of a longitudinal study, which explores the possibility of using blended learning in first-year accounting at one campus of a large regional university. The critical elements of blended learning which emerged in the study are discussed and, consistent with the design-based research framework, the paper also identifies key design modifications for successive cycles of the research.
Abstract:
An earlier CRC-CI project on 'automatic estimating' (AE) has shown the key benefit of model-based design methodologies in building design and construction to be the provision of timely quantitative cost evaluations. Furthermore, using AE during design improves design options and results in improved design turn-around times, better design quality and/or lower costs. However, AEs for civil engineering structures do not exist, and research partners in the CRC-CI expressed interest in exploring the development of such a process. This document reports on these investigations. The central objective of the study was to evaluate the benefits and costs of developing an AE for concrete civil engineering works. By studying existing documents and through interviews with design engineers, contractors and estimators, we have established that current civil engineering practices (mainly roads/bridges) do not use model-based planning/design. Drawings are executed in 2D and only completed at the end of lengthy planning/design project management lifecycle stages. We have also determined that estimating plays two important, but different, roles. The first is part of project management (which we have called macro-level estimating). Estimating in this domain sets project budgets, controls quality delivery and contains costs. The second role is estimating during planning/design (micro-level estimating). The difference between the two roles is that the former is performed at the end of various lifecycle stages, whereas the latter is performed at any suitable time during planning/design.
Abstract:
This document provides the findings of an international review of investment decision-making practices in road asset management. Efforts were concentrated on identifying the strategic objectives of agencies in road asset management, establishing and understanding the criteria different organisations adopt, and ascertaining the exact methodologies used by different countries and international organisations. Road assets are powerful drivers of economic development and social equity. They also have significant impacts on the natural and man-made environment. The traditional definition of asset management is "a systematic process of maintaining, upgrading and operating physical assets cost effectively. It combines engineering principles with sound business practices and economic theory and it provides tools to facilitate a more organised, logical approach to decision-making" (US Dept. of Transportation, 1999). In recent years, the concept has been broadened to cover the complexity of decision making, based on a wider variety of policy considerations as well as social and environmental issues, rather than only the benefit-cost analysis and pure technical considerations previously covered. Current international practices are summarised in Table 2. It was evident that engineering-economic analysis methods are well advanced to support decision-making. A range of available tools supports the prediction of road asset performance and the associated costs/benefits in a technical context. The need to consider the 'triple plus one' bottom line of social, environmental and economic as well as political factors in decision-making is well understood by road agencies around the world. The techniques used to incorporate these, however, are limited. Most countries adopt a scoring method, a goal achievement matrix or information collected from surveys. The greater uncertainty associated with these non-quantitative factors has generally not been taken into consideration.
There is a gap between the capacities of the decision-making support systems and the requirements of decision-makers to make more rational and transparent decisions. The challenges faced in developing an integrated decision-making framework are both procedural and conceptual. In operational terms, the framework should be easy to understand and employ. In philosophical terms, the framework should be able to deal with challenging issues such as uncertainty, time frames, network effects and model changes, while integrating cost and non-cost values into the evaluation. The choice of evaluation techniques depends on the features of the problem at hand, on the aims of the analysis, and on the underlying information base. At different management levels, the complexity of considering social, environmental, economic and political factors in decision-making differs. At the higher, strategic planning level, more non-cost factors are involved. The complexity also varies with the scope of the investment proposals. Road agencies traditionally place less emphasis on the evaluation of maintenance works. In some cases, social equity, safety and environmental issues have been used in maintenance project selection. However, there is no common basis for these applications.
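A scoring method of the kind the review describes can be sketched as a weighted sum across criteria. The criteria, weights, proposals and scores below are invented for illustration and are not drawn from any agency's actual framework.

```python
# Weighted-scoring sketch for multi-criteria ranking of investment proposals.

def weighted_score(scores, weights):
    """Normalise the weights and return the weighted sum of criterion scores."""
    total = sum(weights.values())
    return sum(scores[c] * w / total for c, w in weights.items())

weights = {"economic": 0.4, "social": 0.2, "environmental": 0.2, "safety": 0.2}
proposals = {
    "bridge_upgrade": {"economic": 8, "social": 6, "environmental": 4, "safety": 9},
    "road_resurface": {"economic": 6, "social": 5, "environmental": 7, "safety": 5},
}

# Rank proposals by their weighted score, best first.
ranked = sorted(proposals,
                key=lambda p: weighted_score(proposals[p], weights),
                reverse=True)
# bridge_upgrade: 0.4*8 + 0.2*6 + 0.2*4 + 0.2*9 = 7.0
# road_resurface: 0.4*6 + 0.2*5 + 0.2*7 + 0.2*5 = 5.8
```

The limitation the review points out is visible even in this toy: the weights encode political and social judgements as fixed point values, with no representation of the uncertainty attached to the non-quantitative criteria.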
Abstract:
This report summarises a project designed to enhance commercial real estate performance within both operational and investment contexts through the development of a model aimed at supporting improved decision-making. The model is based on a risk-adjusted discounted cash flow and provides a valuable toolkit for building managers, owners and potential investors to evaluate individual building performance in terms of financial, social and environmental criteria over the complete life-cycle of the asset. The 'triple bottom line' approach to the evaluation of commercial property has much significance for the administrators of public property portfolios in particular. It also has applications more generally for the wider real estate industry, given that the advent of 'green' construction requires new methods for evaluating both new and existing building stocks. The research is unique in that it focuses on the accuracy of the input variables required for the model. These key variables were largely determined by market-based research and an extensive literature review, and have been fine-tuned with extensive testing. In essence, the project has considered probability-based risk analysis techniques that required market-based assessment. The projections listed in the partner engineers' building audit reports for the four case study buildings were fed into the property evaluation model developed by the research team. The results are strongly consistent with previously existing, less robust evaluation techniques. Importantly, this model pioneers an approach for taking full account of the triple bottom line, establishing a benchmark for related research to follow. The project's industry partners expressed a high degree of satisfaction with the project outcomes at a recent demonstration seminar.
The project in its existing form has not been geared towards commercial applications, but it is anticipated that QDPW and other industry partners will benefit greatly from using this tool for the performance evaluation of property assets. The project met the objectives of the original proposal as well as all the specified milestones, and was completed within budget and on time. This research project achieved its objective by establishing research foci on the model structure, the identification of key input variables, the drivers of the relevant property markets and the determinants of the key variables (Research Engine no. 1); the examination of risk measurement and the incorporation of risk simulation exercises (Research Engine no. 2); and the importance of both environmental and social factors and, finally, the impact of the triple bottom line measures on the asset (Research Engine no. 3).
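The discounted-cash-flow core of such a model can be sketched as below, with risk entering as a premium added to the discount rate. The rates and cash flows are invented for illustration and are not drawn from the case-study buildings or the project's actual model.

```python
# Minimal risk-adjusted discounted cash flow (DCF) sketch.

def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows; cash_flows[0] falls at time zero."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Risk adjustment here is a simple premium on top of a risk-free rate; a fuller
# model would instead sample the uncertain inputs (rents, costs, vacancy) in a
# probability-based simulation, as the project's approach suggests.
risk_free, risk_premium = 0.04, 0.03
flows = [-1_000_000] + [150_000] * 10   # purchase outlay, then ten years of net income
value = npv(flows, risk_free + risk_premium)   # positive -> acquisition adds value
```

Extending the cash-flow lines to carry monetised social and environmental costs and benefits is one straightforward way a triple-bottom-line evaluation can be layered onto the same NPV machinery.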
Abstract:
Principal Topic: Small and micro-enterprises are believed to play a significant part in economic growth and poverty alleviation in developing countries. However, a range of issues arise when looking at the support required for local enterprise development, the role of micro-finance and sustainability. This paper explores the issues associated with the establishment and resourcing of micro-enterprise development and proposes a model of sustainable support of enterprise development in very poor developing economies, particularly in Africa. The purpose of this paper is to identify and address the range of issues raised by the literature and by empirical research in Africa regarding micro-finance and small business support, and to develop a model for sustainable support for enterprise development within a particular cultural and economic context. Micro-finance has become big business, with a range of models - from those that operate on a strictly business basis to those that come from a philanthropic base. The models used grow from a range of philosophical and cultural perspectives. Entrepreneurship training is provided around the world. Success is often measured by the number involved and the repayment rates - which are very high, largely because of the lending models used. This paper will explore the range of options available and propose a model that can be implemented and evaluated in rapidly changing developing economies. Methodology/Key Propositions: The research draws on the entrepreneurial and micro-finance literature and on empirical research undertaken in Mozambique, which lies on the Indian Ocean coast of Southern Africa. As a result of war and natural disasters over a prolonged period, there is little industry, primary industries are primitive and there is virtually no infrastructure. Mozambique is ranked as one of the poorest countries in the world.
The conditions in Mozambique, though not identical, reflect conditions in many other parts of Africa. A number of key elements in the development of enterprises in poor countries are explored, including: the impact of micro-finance; sustainable models of micro-finance; education and training; capacity building; support mechanisms; the impact on poverty, families and the local economy; survival entrepreneurship versus growth entrepreneurship; and transitions to the formal sector. Results and Implications: The result of this study is the development of a model for providing intellectual and financial resources to micro-entrepreneurs in poor developing countries in a sustainable way. The model provides a base for ongoing research into the process of entrepreneurial growth in African developing economies. The research raises a number of issues regarding sustainability, including the nature of the donor/recipient relationship, access to affordable resources, the impact of individual entrepreneurial activity on the local economy, and the need for ongoing research to understand the whole process and its impact, intended and unintended.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (de Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that commercial activity based on open source software is possible, and they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT services & software engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT services & software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the 'bundling' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of 'mutualisation', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source related literature. In this article, a business model is considered not only as a way of generating income (the 'revenue model' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.
Abstract:
Principal Topic: The study of the origin and characteristics of venture ideas - or ''opportunities'' as they are often called - and their contextual fit are key research goals in entrepreneurship (Davidsson, 2004). For the purpose of this study, we define a venture idea as ''the core ideas of an entrepreneur about what to sell, how to sell, whom to sell to, and how the entrepreneur acquires or produces the product or service which he/she sells''. When realized, the venture idea becomes a ''business model''. Even though venture ideas are central to entrepreneurship, their characteristics and their effect on the entrepreneurial process remain poorly understood. According to Schumpeter (1934), entrepreneurs can creatively destroy existing market conditions by introducing new products/services, new production methods, new markets, new sources of supply, and the reorganization of industries. The introduction, development and use of new ideas are generally called ''innovation'' (Damanpour & Wischnevsky, 2006), and ''newness'' is a property of innovation: a relative term denoting the degree of unfamiliarity of a venture idea either to a firm or to a market. Schumpeter (1934) discusses five different types of newness, indicating that the type of newness is an important issue. More recently, Shane and Venkataraman (2000) called for research taking into consideration not only the variation in the characteristics of individuals but also the heterogeneity of venture ideas. Empirically, Samuelson (2001, 2004) investigated process differences between innovative and imitative venture ideas; however, he used only a crude dichotomy regarding venture idea newness. According to Davidsson (2004), entrepreneurs can introduce new economic activities ranging from pure imitation to being new to the entire world market, highlighting that newness is a matter of degree.
Dahlqvist (2007) examined venture idea newness and made an attempt at a more refined assessment of the degree and type of newness of venture ideas. Building on these predecessors, our study refines the assessment of venture idea newness by measuring the degree of newness (new to the world, new to the market, substantially improved while not entirely new, and imitation) for four different types of newness (product/service, method of production, method of promotion, and customer/target market). We then relate the type and degree of newness to the pace of progress in the nascent venturing process. We hypothesize that newness will slow down the business creation process. Shane & Venkataraman (2000) introduced entrepreneurship as the nexus of opportunities and individuals. In line with this, some scholars have investigated the relationship between individuals and opportunities. For example, Shane (2000) investigates the relatedness between individuals' prior knowledge and the identification of opportunities. Shepherd & DeTienne (2005) identified a positive relationship between potential financial reward and the identification of innovative venture ideas. Sarasvathy's ''Effectuation Theory'' assumes a high degree of relatedness with founders' skills, knowledge and resources in the selection of venture ideas. However, the entrepreneurship literature offers few analyses of how this relatedness affects the progress of the venturing process. Therefore, we assess the venture ideas' degree of relatedness to prior knowledge and resources, and relate these, too, to the pace of progress in the nascent venturing process. We hypothesize that relatedness will increase the speed of business creation. Methodology: For this study we compare early findings from data collected through the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE).
CAUSEE is a longitudinal study whose primary objective is to uncover the factors that initiate, hinder and facilitate the process of emergence and development of new firms. Data were collected from a representative sample of some 30,000 households in Australia using random digit dialing (RDD) telephone survey interviews. The first round of data collection identified 600 entrepreneurs who were currently involved in the business start-up process. The unit of analysis is the emerging venture, with the respondent acting as its spokesperson. The study methodology allows researchers to identify ventures in the early stages of creation and to follow their progression longitudinally over successive data collection periods. Our measures of newness build on previous work by Dahlqvist (2007); our adapted version was developed over two pre-tests with about 80 participants each. The measures of relatedness were developed through the two rounds of pre-testing. The pace of progress in the venture creation process is assessed with the help of time-stamped gestation activities, a technique developed in the Panel Study of Entrepreneurial Dynamics (PSED). Results and Implications: We hypothesized that venture idea newness slows down the venturing process whereas relatedness facilitates it. Results from 600 nascent entrepreneurs in Australia indicate marginal support for the hypothesis that relatedness assists gestation progress. Newness is significant, but with the opposite sign to that hypothesized. The results have a number of implications for researchers, business founders, consultants and policy makers in terms of better knowledge of the venture creation process.
Abstract:
This project is an extension of a previous CRC project (220-059-B), which developed a program for life prediction of gutters in Queensland schools. A number of sources of information on the service life of metallic building components were formed into databases linked to a Case-Based Reasoning Engine, which extracted relevant cases from each source.
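The retrieval step described above - matching a query against cases held in several source databases - can be sketched as follows. This is a minimal illustration of case-based retrieval under an assumed attribute-overlap similarity measure; all names, data values and the threshold are hypothetical and are not taken from the actual CRC 220-059-B system.

```python
# Minimal sketch of case-based retrieval over multiple case sources.
# Similarity here is a simple attribute-overlap score (an assumption,
# not the project's actual metric); data and names are illustrative.

def similarity(query, case):
    """Fraction of query attributes whose values match the stored case."""
    if not query:
        return 0.0
    matched = sum(1 for k, v in query.items() if case.get(k) == v)
    return matched / len(query)

def retrieve(query, sources, threshold=0.5):
    """Return (source, case, score) tuples for relevant cases, best first."""
    hits = []
    for source_name, cases in sources.items():
        for case in cases:
            score = similarity(query, case)
            if score >= threshold:
                hits.append((source_name, case, score))
    return sorted(hits, key=lambda h: h[2], reverse=True)

# Hypothetical case sources for service life of metallic components.
sources = {
    "school_gutters": [
        {"material": "galvanised steel", "environment": "coastal", "life_years": 12},
        {"material": "zincalume", "environment": "inland", "life_years": 25},
    ],
    "roof_survey": [
        {"material": "galvanised steel", "environment": "inland", "life_years": 20},
    ],
}

query = {"material": "galvanised steel", "environment": "coastal"}
best = retrieve(query, sources)
```

Each source database keeps its own cases; the engine scores every case against the query and pools the relevant ones, so a prediction can draw on evidence from all sources at once.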