778 results for generic model


Relevance:

40.00%

Publisher:

Abstract:

Computational modelling of the mechanisms underlying real-world processes can be of great value in understanding complex biological behaviours. Uptake in general biology and ecology has been rapid. However, it often requires specific data sets that are overly costly in time and resources to collect. The aim of the current study was to test whether a generic behavioural ecology model constructed using published data could give realistic outputs for individual species. An individual-based model was developed using the Pattern-Oriented Modelling (POM) strategy and protocol, based on behavioural rules associated with insect movement choices. Frugivorous Tephritidae (fruit flies) were chosen because of their economic significance in global agriculture and the multiple published data sets available for a range of species. The Queensland fruit fly (Qfly), Bactrocera tryoni, was identified as a suitable individual species for testing. Plant canopies with modified architecture were used to run predictive simulations. A field study was then conducted to validate our model predictions on how plant architecture affects fruit fly behaviour. Characteristics of plant architecture such as shape, e.g., closed-canopy versus vase-shaped, affected fly movement patterns and the time spent on host fruit. The number of visits to host fruit also differed between the edge and centre of closed-canopy plants. Compared with plant architecture, host fruit contributed less to fly movement patterns. The results from this model, combined with our field study and published empirical data, suggest that placing fly traps in the upper canopy at the edge should work best. Such a modelling approach allows rapid testing of ideas about organismal interactions with environmental substrates in silico rather than in vivo, to generate new perspectives. Using published data saves time and resources, and adjustments for specific questions can be achieved by refining parameters based on targeted experiments.
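To make the modelling approach concrete, here is a minimal sketch, in Python, of an individual-based movement simulation in the spirit of the one described: a fly moves between nodes of a canopy graph according to a simple behavioural rule that favours fruit. The canopy layout, the fruit-preference weight and all other values are illustrative assumptions, not the published model's rules or parameters.

import random

CANOPY = {  # hypothetical canopy graph: node -> (substrate kind, neighbours)
    "A": ("leaf", ["B", "C"]),
    "B": ("fruit", ["A", "D"]),
    "C": ("leaf", ["A", "D"]),
    "D": ("fruit", ["B", "C"]),
}
FRUIT_BIAS = 3.0  # assumed preference weight for moving towards fruit

def simulate_fly(start="A", steps=1000, seed=1):
    """Record node visits and time steps spent on fruit for one fly."""
    rng = random.Random(seed)
    node, fruit_time = start, 0
    visits = {n: 0 for n in CANOPY}
    for _ in range(steps):
        _, neighbours = CANOPY[node]
        weights = [FRUIT_BIAS if CANOPY[n][0] == "fruit" else 1.0 for n in neighbours]
        node = rng.choices(neighbours, weights=weights, k=1)[0]
        visits[node] += 1
        if CANOPY[node][0] == "fruit":
            fruit_time += 1
    return visits, fruit_time

visits, fruit_time = simulate_fly()
print(visits, fruit_time)  # rerun with a different CANOPY to compare architectures

Running the same walker over differently shaped canopy graphs is the kind of comparison the abstract describes for closed-canopy versus vase-shaped architectures.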

Relevance:

30.00%

Publisher:

Abstract:

With the advent of Service-Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services are available, finding an appropriate Web service that meets the user's requirement is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe services, together with their input and output parameters, can lead to more accurate Web service discovery, and appropriately linking the individually matched services should fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Services Description Language (WSDL) document, a support-based latent semantic kernel is constructed, using an innovative concept of binning and merging, from a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed from a large number of terms helps to find the hidden meaning of query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link-analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also show that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
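As an illustration of the link-analysis phase, the sketch below models services as graph nodes and applies Floyd-Warshall, one common all-pairs shortest-path algorithm, to recover a minimum-cost composition. The service names, link costs and the choice of Floyd-Warshall are assumptions for illustration; the abstract does not specify the algorithm beyond "all-pairs shortest path".

INF = float("inf")

def floyd_warshall(nodes, edges):
    """edges: {(u, v): cost}. Returns all-pairs distances and a next-hop table."""
    dist = {(u, v): 0.0 if u == v else edges.get((u, v), INF)
            for u in nodes for v in nodes}
    nxt = {(u, v): v for (u, v) in edges}
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i, k] + dist[k, j] < dist[i, j]:
                    dist[i, j] = dist[i, k] + dist[k, j]
                    nxt[i, j] = nxt[i, k]
    return dist, nxt

def compose(u, v, dist, nxt):
    """Reconstruct the minimum-cost chain of services from u to v."""
    if dist[u, v] == INF:
        return None  # no feasible composition
    path = [u]
    while u != v:
        u = nxt[u, v]
        path.append(u)
    return path

services = ["Geocode", "Weather", "Alert"]  # hypothetical services
links = {("Geocode", "Weather"): 1.0, ("Weather", "Alert"): 2.0}  # output-to-input links
dist, nxt = floyd_warshall(services, links)
print(compose("Geocode", "Alert", dist, nxt))  # ['Geocode', 'Weather', 'Alert']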

Relevance:

30.00%

Publisher:

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Among others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as the free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance underlying open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (de Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT services & software engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and IT services & software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the "bundling" business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual-licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: This article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (bundling, dual licensing and mutualisation) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. Here, a business model is not only considered as a way of generating income (the "revenue model" of Amit and Zott, 2001), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.

Relevance:

30.00%

Publisher:

Abstract:

The adoption of e-business by Small and Medium Enterprises (SMEs) in construction lags behind that of other service and product businesses within the building sector. This paper develops a model to facilitate the uptake of electronic business, especially by SMEs within the Australian construction sector. E-business is defined here as "the undertaking of business-related transactions, communications and information exchanges utilising electronic medium and environment". The elicited model highlights the significant changes needed, including skills development and social, economic and cultural issues. It highlights barriers for SMEs migrating towards e-transactions, e-bidding, e-tendering and e-collaboration, and provides learning and skills development components. The model is derived from case study fieldwork and is intended to inform diffusion and awareness models for best practice. Empirical techniques included focus group interviews and one-to-one interviews. Data were transcribed and analysed using cluster analysis. Preliminary results reveal that current models for e-business adoption are not effective within the construction context, as they have emerged from other service and product industries such as retail or tourism. These generic models have largely ignored the nature of the construction industry, and some modification appears to be required. This paper therefore proposes an alternative adoption model which is more sensitive to the nature of the industry, particularly for e-business uptake in building SMEs.

Relevance:

30.00%

Publisher:

Abstract:

We consider one-round key exchange protocols secure in the standard model. The security analysis uses the powerful security model of Canetti and Krawczyk and a natural extension of it to the ID-based setting. It is shown how key encapsulation mechanisms (KEMs) can be used in a generic way to obtain two different protocol designs with progressively stronger security guarantees. A detailed analysis of the performance of the protocols is included; surprisingly, when instantiated with specific KEM constructions, the resulting protocols are competitive with the best previous schemes, which have proofs only in the random oracle model.
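A minimal sketch of the generic idea follows, assuming (since the abstract does not spell out the details) that each party holds a long-term KEM key pair, sends one KEM ciphertext per session (one round), and derives the session key from both encapsulated secrets and the transcript. The toy Diffie-Hellman-based KEM and its parameters are insecure stand-ins for a concrete instantiation, not the paper's constructions.

import hashlib
import secrets

P = 2 ** 127 - 1  # toy prime modulus: far too small to be secure
G = 3

def kem_keygen():
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk              # (long-term public key, secret key)

def kem_encap(pk):
    r = secrets.randbelow(P - 2) + 1
    ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
    return pow(G, r, P), ss               # (ciphertext, shared secret)

def kem_decap(sk, ct):
    return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

def derive(ss1, ss2, ct1, ct2):
    """Session key = hash of both shared secrets and the transcript."""
    t = ct1.to_bytes(16, "big") + ct2.to_bytes(16, "big")
    return hashlib.sha256(ss1 + ss2 + t).digest()

pk_a, sk_a = kem_keygen()                 # Alice's long-term key pair
pk_b, sk_b = kem_keygen()                 # Bob's long-term key pair
ct_ab, ss_ab = kem_encap(pk_b)            # Alice -> Bob (the single round)
ct_ba, ss_ba = kem_encap(pk_a)            # Bob -> Alice
k_alice = derive(ss_ab, kem_decap(sk_a, ct_ba), ct_ab, ct_ba)
k_bob = derive(kem_decap(sk_b, ct_ab), ss_ba, ct_ab, ct_ba)
assert k_alice == k_bob                   # both parties derive the same session key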

Relevance:

30.00%

Publisher:

Abstract:

With the increasing growth of cultural events both in Australia and internationally, there has also been an increase in event management studies, in theory and in practice. Although many researchers have identified the knowledge and skills required specifically by event managers (Perry et al., 1996; Getz, 2002; Silvers et al., 2006) and generic event management models have been proposed, including 'project management' strategies in an event context (Getz, 2007), knowledge gaps still exist for specific types of events, especially not-for-profit arts events. For events of a largely voluntary nature, insufficient resources, including finance, human resources and infrastructure, are recognised as the most challenging constraint. Therefore, the concepts and principles adopted by large-scale commercial events may not suit not-for-profit arts events that aim to provide professional networking opportunities for artists. Building partnerships is identified as a key strategy in developing an effective management model for this type of event. Using the 2008 World Dance Alliance Global Summit (WDAGS), held in Brisbane on 13-18 July, as a case study, the level, nature and relationships of key partners are investigated. Data are triangulated from interviews with organisers of the 2008 WDAGS, online and email surveys of delegates, participant observation, and analysis of formal and informal documents, to produce a management model suited to this kind of event.

Relevance:

30.00%

Publisher:

Abstract:

We give a direct construction of a certificateless key encapsulation mechanism (KEM) in the standard model that is more efficient than the generic constructions previously proposed by Huang and Wong \cite{DBLP:conf/acisp/HuangW07}. We use a direct construction based on Kiltz and Galindo's KEM scheme \cite{DBLP:conf/acisp/KiltzG06} to obtain a certificateless KEM in the standard model; our construction is roughly twice as efficient as the generic construction. We also address the security flaw discovered by Selvi et al. \cite{cryptoeprint:2009:462}.
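For orientation, the skeleton below lists the algorithms a certificateless KEM exposes in the standard certificateless setting: the key generation centre (KGC) extracts a partial private key from the user's identity, and the user contributes a self-chosen secret, removing the need for certificates and avoiding full key escrow. The names and signatures are an illustrative sketch, not the paper's concrete Kiltz-Galindo-based construction.

from abc import ABC, abstractmethod

class CertificatelessKEM(ABC):
    """Algorithm interface of a certificateless KEM (illustrative names)."""

    @abstractmethod
    def setup(self):
        """Return (public parameters, KGC master secret key)."""

    @abstractmethod
    def partial_key_extract(self, msk, identity):
        """KGC derives a partial private key from the user's identity."""

    @abstractmethod
    def set_secret_value(self):
        """User picks a secret value unknown to the KGC (prevents escrow)."""

    @abstractmethod
    def set_public_key(self, params, secret_value, identity):
        """User publishes a public key; no certificate binds it to the identity."""

    @abstractmethod
    def encap(self, params, identity, user_pk):
        """Return (ciphertext, encapsulated key) for the given identity and key."""

    @abstractmethod
    def decap(self, params, partial_key, secret_value, ct):
        """Recover the encapsulated key using both private components."""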

Relevance:

30.00%

Publisher:

Abstract:

Background: The quality of stormwater runoff from ports is significant as it can be an important source of pollution to the marine environment. This is a significant issue for the Port of Brisbane, which is located in an area of high environmental value. It is therefore imperative to develop an in-depth understanding of stormwater runoff quality to ensure that appropriate strategies are in place for quality improvement where necessary. To this end, the Port of Brisbane Corporation aimed to develop a port-specific stormwater model for the Fisherman Islands facility. This need must be considered in the context of the proposed future development of the Port area.

The Project: The research project is an outcome of the collaborative partnership between the Port of Brisbane Corporation (POBC) and Queensland University of Technology (QUT). A key feature of this partnership is that it seeks to undertake research that assists the Port in strengthening its environmental custodianship of the Port area through cutting-edge research and its translation into practical application.

The project was separated into two stages. The first stage developed a quantitative understanding of the pollutant-load generation potential of the existing land uses. This knowledge was then used as input for the stormwater quality model developed in the subsequent stage. The aim is to expand this model across the yet-to-be-developed port expansion area, in order to predict pollutant loads associated with stormwater flows from this area, with the longer-term objective of contributing to the development of ecological risk mitigation strategies for future expansion scenarios.

Study approach: Stage 1 of the overall study confirmed that Port land uses are unique in terms of the anthropogenic activities occurring on them. This uniqueness results in distinctive stormwater quality characteristics different from other conventional urban land uses. Therefore, it was not scientifically valid to consider the Port as belonging to a single land use category or as being similar to any typical urban land use. The approach adopted in this study was very different from conventional modelling studies, where modelling parameters are developed through calibration. The field investigations undertaken in Stage 1 created fundamental knowledge on pollutant build-up and wash-off in different Port land uses. This knowledge was then used in computer modelling so that the specific characteristics of pollutant build-up and wash-off could be replicated. Consequently, no calibration was involved, since measured build-up and wash-off parameters were used.

Conclusions: Stage 2 of the study was primarily undertaken using the SWMM stormwater quality model, a physically based model which replicates natural processes as closely as possible. The time step used and the catchment variability considered were adequate to accommodate the temporal and spatial variability of input parameters, and the parameters used in the modelling reflect the true nature of rainfall-runoff and pollutant processes to the best of currently available knowledge. In this study, the initial loss values adopted for the impervious surfaces are relatively high compared to values noted in the research literature. However, given the scientifically valid approach used for the field investigations, it is appropriate to adopt the initial losses derived from this study for future modelling of Port land uses. The relatively high initial losses will significantly reduce both the runoff volume generated and the frequency of runoff events. Apart from initial losses, most of the other parameters used in SWMM modelling are generic to most modelling studies. Development of parameters for MUSIC model source nodes was one of the primary objectives of this study. MUSIC uses the mean and standard deviation of pollutant parameters based on a normal distribution. However, based on the values generated in this study, the variation of Event Mean Concentrations (EMCs) for Port land uses within the investigation period does not fit a normal distribution. This is possibly because only one specific location, the Port of Brisbane, was considered, unlike the MUSIC model, for which a range of areas with different geographic and climatic conditions were investigated. Consequently, the assumptions used in MUSIC are not fully applicable to the analysis of water quality in Port land uses. Therefore, when using the parameters included in this report for MUSIC modelling, it is important to note that they may result in under- or over-estimation of annual pollutant loads. It is recommended that the annual pollutant load values given in the report be used as a guide to assess the accuracy of the modelling outcomes. A step-by-step guide for using the knowledge generated from this study for MUSIC modelling is given in Table 4.6.

Recommendations: The following recommendations are provided to further strengthen the cutting-edge nature of the work undertaken:
* It is important to further validate the approach recommended for stormwater quality modelling at the Port. Validation will require data collection on rainfall, runoff and water quality from the selected Port land uses. Additionally, the recommended modelling approach could be applied to a soon-to-be-developed area to assess 'before' and 'after' scenarios.
* In the modelling study, TSS was adopted as the surrogate parameter for other pollutants, based on other urban water quality research undertaken at QUT. The validity of this approach should be further assessed for Port land uses.
* The adoption of TSS as a surrogate parameter for other pollutants, and the confirmation that the <150 µm particle size range was predominant in suspended solids in pollutant wash-off, give rise to a number of important considerations. The ability of the existing structural stormwater mitigation measures to remove the <150 µm particle size range needs to be assessed. The feasibility of introducing source control measures, as opposed to end-of-pipe measures, for stormwater quality improvement may also need to be considered.
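For readers unfamiliar with build-up/wash-off modelling, the sketch below shows the standard exponential forms that SWMM-type models commonly use for pollutant accumulation over dry days and mobilisation during a runoff event. All parameter values are illustrative assumptions, not the measured Port of Brisbane values reported by the study.

import math

B_MAX = 12.0   # maximum surface build-up (kg/ha), assumed
K_BUILD = 0.4  # build-up rate constant (1/day), assumed
C_WASH = 0.18  # wash-off coefficient (per mm of runoff), assumed

def buildup(days_dry):
    """Exponential pollutant build-up towards B_MAX over antecedent dry days."""
    return B_MAX * (1.0 - math.exp(-K_BUILD * days_dry))

def washoff(load, runoff_mm):
    """Load mobilised by a runoff event as a fraction of what is on the surface."""
    return load * (1.0 - math.exp(-C_WASH * runoff_mm))

load = buildup(days_dry=7)
event_load = washoff(load, runoff_mm=25)
print(f"build-up {load:.2f} kg/ha, washed off {event_load:.2f} kg/ha")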

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a generic decoupled image-based control scheme for cameras obeying the unified projection model. The scheme is based on the spherical projection model. Invariants to rotational motion are computed from this projection and used to control the translational degrees of freedom. Importantly, we form invariants which decrease the sensitivity of the interaction matrix to variation in object depth. Finally, the proposed results are validated in experiments using a classical perspective camera as well as a fisheye camera mounted on a 6-DOF robotic platform.
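The geometric fact underpinning the decoupling can be shown in a few lines: once image points are lifted onto the unit sphere, the inner product between any two projected points is unchanged by a camera rotation, so features built from such angles respond only to translation. The sketch below reduces the unified projection model to the pinhole case (mirror parameter xi = 0) for brevity, and the point coordinates are illustrative.

import numpy as np

def to_sphere(x, y):
    """Lift a normalised image point onto the unit sphere (pinhole case, xi = 0)."""
    p = np.array([x, y, 1.0])
    return p / np.linalg.norm(p)

def rotation_invariant(p1, p2):
    """cos(angle) between two spherical points; (R p1) . (R p2) = p1 . p2
    for any rotation R, so this feature is invariant to camera rotation."""
    return float(p1 @ p2)

theta = 0.3  # arbitrary rotation about the optical axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
s1, s2 = to_sphere(0.1, 0.2), to_sphere(-0.3, 0.05)
assert np.isclose(rotation_invariant(s1, s2), rotation_invariant(R @ s1, R @ s2))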

Relevance:

30.00%

Publisher:

Abstract:

Car-following models play a critical role in all microscopic traffic simulation models. Current microscopic simulation models are unable to mimic the unsafe behaviour of drivers, as most are based on presumptions about drivers' safe behaviour. The Gipps model is a widely used car-following model embedded in various micro-simulation tools. This paper examines the Gipps car-following model to investigate ways of improving it for safety studies. It puts forward suggestions to modify the Gipps model so as to improve its ability to simulate unsafe vehicle movements (vehicles with safety indicators below critical thresholds). The result is one step towards facilitating the assessment and prediction of safety on motorways using microscopic simulation. NGSIM, a rich source of motorway vehicle trajectory data, is used to extract relatively risky events. Short following headways and Time To Collision are used to identify critical safety events within the traffic flow. The results show that the proposed modified car-following model predicts the unsafe trajectories with smaller error values than the generic Gipps model.
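For reference, here is a minimal sketch of the generic Gipps update that the paper modifies, together with a simple Time-To-Collision (TTC) indicator of the kind used to flag risky events. Braking rates are written as positive magnitudes, and every parameter value is an illustrative assumption rather than an NGSIM-calibrated one.

import math

A = 1.7       # maximum acceleration (m/s^2), assumed
D = 3.0       # follower's maximum braking, positive magnitude (m/s^2), assumed
D_HAT = 3.0   # follower's estimate of the leader's braking (m/s^2), assumed
V_DES = 25.0  # desired speed (m/s), assumed
TAU = 0.67    # driver reaction time (s), assumed
S = 6.5       # effective leader length plus standstill gap (m), assumed

def gipps_speed(v, v_lead, gap):
    """Follower speed one reaction time ahead: the minimum of the acceleration-
    limited and safe-braking branches of the generic Gipps (1981) model."""
    v_acc = v + 2.5 * A * TAU * (1 - v / V_DES) * math.sqrt(0.025 + v / V_DES)
    under = D ** 2 * TAU ** 2 + D * (2 * (gap - S) - v * TAU + v_lead ** 2 / D_HAT)
    v_safe = -D * TAU + math.sqrt(max(under, 0.0))
    return max(0.0, min(v_acc, v_safe))

def time_to_collision(v, v_lead, gap):
    """Time until contact if both speeds stay constant (infinite if the gap opens)."""
    closing = v - v_lead
    return gap / closing if closing > 0 else math.inf

v_next = gipps_speed(v=20.0, v_lead=15.0, gap=12.0)
ttc = time_to_collision(20.0, 15.0, 12.0)
print(f"next speed {v_next:.1f} m/s, TTC {ttc:.1f} s")  # low TTC flags a risky event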

Relevance:

30.00%

Publisher:

Abstract:

The uncertain and dynamic nature of International Construction Joint Venture (ICJV) performance involves many critical factors which make partner relationships more complex with respect to the decisions needed to maintain a cohesive environment. To address this, a generic system dynamics performance model for ICJVs is developed, integrating a number of variables to capture their overall impact on ICJV performance and to support effective decision making. To formulate and validate the model both structurally and behaviourally, qualitative and quantitative data were gathered through intensive interviews with two ICJVs in Thailand. Intensive simulations of the model identified three major problems: a negative value gap, low construction productivity, and a high rate of ineffective information sharing in both ICJVs. Several policies are suggested, and the integrated application of these policies provides the greatest improvement in ICJV performance.
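To indicate how such a system dynamics model is exercised, the sketch below integrates a single hypothetical performance stock forward with Euler steps and compares a base run against an information-sharing policy. The variables, equations and numbers are invented for illustration and do not reproduce the ICJV model's actual structure.

DT = 0.25      # Euler time step (months), assumed
STEPS = 120    # simulated horizon (30 months), assumed

def simulate(info_sharing_eff):
    """Integrate the 'performance' stock forward with Euler steps."""
    performance = 50.0                            # initial stock level, assumed
    history = []
    for _ in range(STEPS):
        productivity = 20.0 * info_sharing_eff    # inflow driven by information sharing
        value_gap = 0.10 * performance            # outflow: erosion of created value
        performance += (productivity - value_gap) * DT
        history.append(performance)
    return history

base, improved = simulate(0.6)[-1], simulate(0.9)[-1]
print(f"final performance: base {base:.1f} vs information-sharing policy {improved:.1f}")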

Relevance:

30.00%

Publisher:

Abstract:

The paper investigates train scheduling problems where prioritised trains and non-prioritised trains traverse a single-line rail network simultaneously. In this case, no-wait conditions arise because prioritised trains, such as express passenger trains, should traverse the network continuously without any interruption. In comparison, non-prioritised trains, such as freight trains, are allowed to enter the next section immediately if possible, or to remain in a section until the next section on the route becomes available, which can be thought of as a relaxation of the no-wait conditions. Through a thorough analysis of the structural properties of the No-Wait Blocking Parallel-Machine Job-Shop-Scheduling (NWBPMJSS) problem formulated in this research, an innovative generic constructive algorithm (called NWBPMJSS_Liu-Kozan) is proposed to construct a feasible train timetable for a given order of trains. In particular, the proposed NWBPMJSS_Liu-Kozan constructive algorithm comprises several recursively used sub-algorithms (the Best-Starting-Time-Determination, Blocking-Time-Determination, Conflict-Checking, Conflict-Eliminating, Tune-up and Fine-tune procedures) to guarantee feasibility by satisfying the blocking, no-wait, deadlock-free and conflict-free constraints. A two-stage hybrid heuristic algorithm (NWBPMJSS_Liu-Kozan-BIH) is developed by combining the NWBPMJSS_Liu-Kozan constructive algorithm with the Best-Insertion-Heuristic (BIH) algorithm to find a preferable train schedule efficiently and economically. Extensive computational experiments show that the proposed methodology is promising because it can be applied as a standard and fundamental toolbox for identifying, analysing, modelling and solving real-world scheduling problems.
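The no-wait feasibility idea at the heart of the constructive approach can be illustrated simply: a prioritised train must cross its sections back to back, so its departure time is shifted until every section is free for the train's entire fixed-offset occupation window. The brute-force time-shift search and the data below are illustrative, not the NWBPMJSS_Liu-Kozan sub-procedures themselves.

def overlaps(a_start, a_end, b_start, b_end):
    return a_start < b_end and b_start < a_end

def earliest_no_wait_start(run_times, busy, step=1.0, horizon=1000.0):
    """run_times: traversal time of each section in routing order.
    busy[i]: (start, end) intervals when section i is occupied by other trains.
    Returns the earliest departure giving a conflict-free, uninterrupted run."""
    t = 0.0
    while t < horizon:
        offset, feasible = 0.0, True
        for i, r in enumerate(run_times):
            s, e = t + offset, t + offset + r    # fixed offsets: the train never waits
            if any(overlaps(s, e, bs, be) for bs, be in busy[i]):
                feasible = False
                break
            offset += r
        if feasible:
            return t
        t += step
    return None

busy = [[(0, 5)], [(8, 12)], []]                 # hypothetical section occupations
print(earliest_no_wait_start([4, 3, 5], busy))   # -> 8.0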

Relevance:

30.00%

Publisher:

Abstract:

This paper summarises recent studies on learning approaches that have utilised some form of Web 2.0 service in curriculum design to enhance learning. A generic implementation model of this integration is then presented to illustrate the overall learning implementation process. Recently, the integration of Web 2.0 technologies into the learning curriculum has begun to gain wide acceptance among teaching instructors across various higher learning institutions. This is evidenced by numerous studies reporting the implementation of a range of Web 2.0 technologies in learning design to improve learning delivery. Moreover, recent studies have shown that current students embrace Web 2.0 technologies more readily than existing learning technologies. Despite various attempts by teachers at such integration, researchers have noted the lack of an integration standard to guide curriculum design. The absence of this standard restricts the adoption of Web 2.0 in learning and adds complexity to providing meaningful learning. Therefore, this paper draws a conceptual integration model that reflects how learning activities facilitated by Web 2.0 are currently being implemented. The design of this model is based on experiences shared by many scholars, as well as feedback gathered from two separate surveys conducted with teachers and a group of 180 students. Furthermore, this paper identifies some key components that are generally engaged in the design of Web 2.0 teaching and learning and that need to be addressed accordingly. The content of this paper is organised as follows. The first part introduces the importance of Web 2.0 implementation in teaching and learning from the perspective of higher education institutions, and the challenges surrounding this area. The second part summarises related work in this field and brings forward the concept of designing learning with the incorporation of Web 2.0 technology. The next part presents the results of the analysis derived from the two surveys of students and teachers on using Web 2.0 during learning activities. The paper concludes by presenting a model that reflects the key entities that may be involved in the learning design.