924 results for Specification
Abstract:
Business process models have traditionally been an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally effective, they can be quite time-consuming and risk introducing inaccuracies when information is forgotten or misinterpreted by analysts. In this study, the potential of a role-playing approach for process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. Empirical data obtained in this study suggest that this approach may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
Abstract:
A Bitcoin wallet is a set of private keys known to a user and which allow that user to spend any Bitcoin associated with those keys. In a hierarchical deterministic (HD) wallet, child private keys are generated pseudorandomly from a master private key, and the corresponding child public keys can be generated by anyone with knowledge of the master public key. These wallets have several interesting applications including Internet retail, trustless audit, and a treasurer allocating funds among departments. A specification of HD wallets has even been accepted as Bitcoin standard BIP32. Unfortunately, in all existing HD wallets---including BIP32 wallets---an attacker can easily recover the master private key given the master public key and any child private key. This vulnerability precludes use cases such as a combined treasurer-auditor, and some in the Bitcoin community have suspected that this vulnerability cannot be avoided. We propose a new HD wallet that is not subject to this vulnerability. Our HD wallet can tolerate the leakage of up to m private keys with a master public key size of O(m). We prove that breaking our HD wallet is at least as hard as the so-called "one more" discrete logarithm problem.
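As a hedged illustration of the weakness this abstract describes, the sketch below mirrors the additive structure of non-hardened BIP32-style derivation: a child private key is the master private key plus a tweak that anyone holding the extended master public key (public key and chain code) can recompute. The byte strings standing in for the public key and chain code are placeholders, and real BIP32 uses secp256k1 point arithmetic and compressed-point serialisation, which this sketch deliberately abstracts away.

```python
import hashlib
import hmac

# Order of the secp256k1 group, as used for BIP32 key arithmetic.
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def child_tweak(chain_code: bytes, parent_pubkey: bytes, index: int) -> int:
    """Left half of HMAC-SHA512(chain_code, parent_pubkey || index), the
    additive tweak used by non-hardened derivation (index < 2**31)."""
    data = parent_pubkey + index.to_bytes(4, "big")
    digest = hmac.new(chain_code, data, hashlib.sha512).digest()
    return int.from_bytes(digest[:32], "big") % N

# Placeholder master key material (illustrative values, not a real wallet).
master_sk = int.from_bytes(hashlib.sha256(b"demo master seed").digest(), "big") % N
chain_code = hashlib.sha256(b"demo chain code").digest()
master_pk = b"\x02" + hashlib.sha256(b"demo compressed public key").digest()

# Non-hardened derivation: child private key = master private key + tweak (mod N).
index = 0
child_sk = (master_sk + child_tweak(chain_code, master_pk, index)) % N

# The attack the abstract refers to: anyone holding the extended public key
# (master_pk, chain_code) who also learns one child private key can recompute
# the tweak and subtract it to recover the master private key.
recovered = (child_sk - child_tweak(chain_code, master_pk, index)) % N
assert recovered == master_sk
```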
Abstract:
The quality of environmental decisions is gauged according to the management objectives of a conservation project. Management objectives are generally about maximising some quantifiable measure of system benefit, for instance population growth rate. They can also be defined in terms of learning about the system in question, in which case actions are chosen to maximise knowledge gain, for instance in experimental management sites. Learning about a system can also take place during practical management. The adaptive management framework (Walters 1986) formally acknowledges this by evaluating learning in terms of how it will improve management of the system and therefore future system benefit, and this is taken into account when ranking actions using stochastic dynamic programming (SDP). However, the benefits of any management action lie on a spectrum from pure system benefit, when there is nothing to be learned about the system, to pure knowledge gain. The current adaptive management framework does not permit management objectives to evaluate actions over the full range of this spectrum. By evaluating knowledge gain in units distinct from future system benefit, this whole spectrum of management objectives can be unlocked. This paper outlines six decision-making policies that differ across the spectrum from pure system benefit through to pure learning. The extensions to adaptive management presented here allow the relative importance of learning compared with system benefit to be specified in management objectives. Practitioners can therefore be more specific in constructing conservation project objectives and can create policies for experimental management sites in the same framework as practical management sites.
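One way to picture the benefit-to-learning spectrum described here is a weighted objective inside the SDP recursion; the formulation below is an illustrative sketch in generic notation, not the paper's six policies.

```latex
% Illustrative only: x_t = system state, b_t = belief about the system model,
% a_t = action, B = immediate system benefit, K = knowledge gain measured in
% its own units, \lambda \in [0,1] the relative weight on learning,
% \gamma a discount factor.
V(x_t, b_t) = \max_{a_t} \Big\{ (1-\lambda)\, B(x_t, a_t) + \lambda\, K(b_t, a_t)
  + \gamma\, \mathbb{E}\big[ V(x_{t+1}, b_{t+1}) \mid x_t, b_t, a_t \big] \Big\}
```

Setting λ = 0 recovers pure system benefit and λ = 1 pure learning, with intermediate values spanning the spectrum.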
Abstract:
As a result of the more distributed nature of organisations and the inherently increasing complexity of their business processes, a significant effort is required for the specification and verification of those processes. The composition of activities into a business process that accomplishes a specific organisational goal has primarily been a manual task. Automated planning is a branch of artificial intelligence (AI) in which activities are selected and organised by anticipating their expected outcomes with the aim of achieving some goal. As such, automated planning would seem to be a natural fit for the business process management (BPM) domain as a way to automate the specification of control flow. A number of attempts have been made to apply automated planning to business process and service composition in different stages of the BPM lifecycle. However, a unified adoption of these techniques throughout the BPM lifecycle is missing. We therefore propose a new intention-centric BPM paradigm, which aims to minimise specification effort by exploiting automated planning techniques to achieve a pre-stated goal. This paper provides a vision of the future possibilities of enhancing BPM with automated planning. A research agenda is presented, giving an overview of the opportunities and challenges for the exploitation of automated planning in BPM.
Abstract:
Brain damage and neurological disorders caused by injury or disease affect a large number of people, often resulting in lifelong disabilities. Multipotent mesenchymal stem cells have a large capacity for self-renewal and provide an excellent model to examine the regulation and contribution of both stem cells and their surrounding microenvironment to the repair of neural tissue damage. This study examined the role of heparan sulfate proteoglycans (HSPGs) in neural lineage differentiation of human mesenchymal stem cells (hMSCs). hMSCs were characterised throughout extended in vitro expansion for neural lineage potential (neurons, astrocytes, oligodendrocytes) and differentiated using both terminal differentiation and intermediate sphere formation. Several HSPGs were identified as potential new targets controlling neural fate specification and may be applied to the development of improved models to examine and repair brain damage.
Abstract:
This project examined the role that written specifications play in the building procurement process and the relationship that specifications should have with respect to the use of BIM within the construction industry. A three-part approach was developed to integrate specifications, product libraries and BIM. Typically handled by different disciplines within project teams, these provide the basis for a holistic approach to the development of building descriptions through the design process and into construction.
Abstract:
The motivation for this analysis is the recently developed Excellence in Research for Australia (ERA) program developed to assess the quality of research in Australia. The objective is to develop an appropriate empirical model that better represents the underlying production of higher education research. In general, past studies on university research performance have used standard DEA models with some quantifiable research outputs. However, these suffer from the twin maladies of an inappropriate production specification and a lack of consideration of the quality of output. By including the qualitative attributes of peer-reviewed journals, we develop a procedure that captures both quality and quantity, and apply it using a network DEA model. Our main finding is that standard DEA models tend to overstate the research efficiency of most Australian universities.
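For readers unfamiliar with DEA, the linear program below is the standard output-oriented, constant-returns-to-scale envelopment model from which efficiency scores of this kind derive; it is a background sketch, not the network DEA model with quality-weighted outputs developed in the paper.

```latex
% x_{ij}: input i of university j (e.g. academic staff, research income);
% y_{rj}: output r of university j (e.g. quality-weighted publications);
% university o is the one being evaluated, \phi its output-expansion factor.
\max_{\phi,\,\lambda}\ \phi
\quad \text{s.t.} \quad
\sum_{j=1}^{J} \lambda_j x_{ij} \le x_{io} \ \ \forall i, \qquad
\sum_{j=1}^{J} \lambda_j y_{rj} \ge \phi\, y_{ro} \ \ \forall r, \qquad
\lambda_j \ge 0 \ \ \forall j.
```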
Abstract:
Engineers and asset managers must often make decisions on how best to allocate limited resources amongst different interrelated activities, including repair, renewal, inspection, and procurement of new assets. The presence of project interdependencies and the lack of sufficient information on the true value of an activity often produce complex problems and leave the decision maker guessing about the quality and robustness of their decision. In this paper, a decision support framework for uncertain interrelated activities is presented. The framework employs a methodology for multi-criteria ranking in the presence of uncertainty, detailing the effect that uncertain valuations may have on the priority of a particular activity. It uses semi-quantitative risk measures that can be tailored to an organisation and enable transparent and simple-to-use uncertainty specification by the decision maker. The framework is then demonstrated on a real-world project set from a major Australian utility provider.
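As a rough illustration of how uncertain valuations can be propagated into rank robustness (an illustrative sketch only, with hypothetical activity names and score ranges, not the framework or data from the paper):

```python
import random
from collections import defaultdict

# Hypothetical activities with uncertain scores given as (low, high) ranges
# on a semi-quantitative scale; names and ranges are illustrative only.
activities = {
    "renew pump station": (3.0, 5.0),
    "inspect feeder main": (2.0, 4.0),
    "procure spare transformer": (1.0, 3.5),
}

def rank_stability(activities, trials=10_000, seed=0):
    """Monte Carlo over the uncertain scores: how often is each activity
    ranked first across sampled valuations?"""
    rng = random.Random(seed)
    first_counts = defaultdict(int)
    for _ in range(trials):
        draw = {name: rng.uniform(lo, hi) for name, (lo, hi) in activities.items()}
        best = max(draw, key=draw.get)
        first_counts[best] += 1
    return {name: count / trials for name, count in first_counts.items()}

print(rank_stability(activities))
```

The output gives a simple picture of how sensitive a priority ranking is to the stated uncertainty, which is the kind of question such a framework is designed to answer.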
Abstract:
Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A concentrated plasticity formulation suitable for practical advanced analysis of steel frame structures comprising non-compact sections is presented in this paper. This formulation, referred to as the refined plastic hinge method, implicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling.
Abstract:
Despite longstanding concern with the dimensionality of the service quality construct as measured by ServQual and IS-ServQual instruments, variations on the IS-ServQual instrument have been enduringly prominent in both academic research and practice in the field of IS. We explain the continuing popularity of the instrument based on the salience of the item set for predicting overall customer satisfaction, suggesting that the preoccupation with the dimensions has been a distraction. The implicit mutual exclusivity of the items suggests a more appropriate conceptualization of IS-ServQual as a formative index. This conceptualization resolves the paradox in IS-ServQual research: how an instrument with such well-known and well-documented weaknesses continues to be influential and widely used by academics and practitioners. A formative conceptualization acknowledges and addresses the criticisms of IS-ServQual, while simultaneously explaining its enduring salience by focusing on the items rather than the “dimensions.” By employing an opportunistic sample and adopting the most recent IS-ServQual instrument published in a leading IS journal (virtually any valid IS-ServQual sample in combination with a previously tested instrument variant would suffice for study purposes), we demonstrate that when re-specified as both first-order and second-order formatives, IS-ServQual has good model quality metrics and high predictive power on customer satisfaction. We conclude that this formative specification has higher practical use and is more defensible theoretically.
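In standard structural equation modelling notation, the contrast drawn here between the usual reflective treatment and the proposed formative index can be sketched as follows (illustrative notation, not the paper's measurement model):

```latex
% Reflective specification: each item x_i reflects the latent construct \eta.
x_i = \lambda_i \eta + \varepsilon_i, \qquad i = 1, \dots, k
% Formative specification: the index \eta is formed by the weighted items.
\eta = \sum_{i=1}^{k} w_i x_i + \zeta
```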
Abstract:
Bearing failure is a form of localized failure that occurs when thin-walled cold-formed steel sections are subjected to concentrated loads or support reactions. To determine the bearing capacity of cold-formed channel sections, a unified design equation with different bearing coefficients is given in the current North American specification AISI S100 and the Australian/New Zealand standard AS/NZS 4600. However, coefficients are not available for unlipped channel sections that are normally fastened to supports through their flanges. Eurocode 3 Part 1.3 includes bearing capacity equations for different load cases, but does not distinguish between fastened and unfastened support conditions. Therefore, an experimental study was conducted to determine the bearing capacities of these sections as used in floor systems. Twenty-eight web bearing tests on unlipped channel sections with restrained flanges were conducted under End One Flange (EOF) and Interior One Flange (IOF) load cases. Using the results from this study, a new equation was proposed within the AISI S100 and AS/NZS 4600 guidelines to determine the bearing capacities of cold-formed unlipped channels with flanges fastened to supports. A new design rule was also proposed based on the direct strength method.
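For context, the unified design equation referred to here takes the general form below in AISI S100 and AS/NZS 4600, with coefficients tabulated by section type, flange fastening condition and load case (the experimental programme described in this abstract supplies coefficients for the fastened unlipped-channel cases); notation follows the usual specification symbols.

```latex
% P_n: nominal web bearing (crippling) capacity; C, C_R, C_N, C_h: tabulated
% coefficients; t: web thickness; F_y: yield stress; \theta: angle between the
% web and the bearing surface; R: inside bend radius; N: bearing length;
% h: flat depth of the web.
P_n = C\, t^2 F_y \sin\theta
  \left(1 - C_R\sqrt{\tfrac{R}{t}}\right)
  \left(1 + C_N\sqrt{\tfrac{N}{t}}\right)
  \left(1 - C_h\sqrt{\tfrac{h}{t}}\right)
```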
Abstract:
Norfolk Island is an Australian external territory in Oceania. The significant road safety reforms in Australia from the 1970s onward bypassed the island, and most road safety ‘silver bullets’ adopted in other Australian jurisdictions were not introduced. While legislative amendments in 2010 introduced mandatory seat belt wearing for vehicle occupants on Norfolk Island, other critical issues face the community, including drink driving by residents and visitors, occupant protection for vehicle passengers, and the provision of a more protective road environment. The release of the first Norfolk Island road safety strategy 2014-2016 proposed, inter alia:
• a lower BAC of 0.05 and the introduction of compulsory driver alcohol and drug testing by police;
• targeted enforcement of occupant protection for vehicle passengers, particularly for passengers riding on vehicle tray backs;
• education interventions to challenge values held by some members of the community that support unsafe road use;
• ensuring that driver information, training and testing is adequate for all drivers;
• identification and rectification of hazardous roadside infrastructure, particularly barrier protection at “high drop locations” within the road network; and
• developing a specification for vehicle standards for vehicles imported into Norfolk Island.
Norfolk Island is engaging in a process of integration with the Australian community, and wider issues relating to funding and resources have impacted on the implementation of the road safety strategy. The response to the strategy will be discussed, particularly in terms of current attempts to address drink driving and the provision of a safer road environment.
Abstract:
The approach of generalized estimating equations (GEE) is based on the framework of generalized linear models but allows for specification of a working correlation matrix for modeling within-subject correlations. The variance is often assumed to be a known function of the mean. This article investigates the impact of misspecifying the variance function on estimators of the mean parameters for quantitative responses. Our numerical studies indicate that (1) correct specification of the variance function can improve estimation efficiency even if the correlation structure is misspecified; (2) misspecification of the variance function affects estimators for within-cluster covariates much more than estimators for cluster-level covariates; and (3) if the variance function is misspecified, correct choice of the correlation structure may not necessarily improve estimation efficiency. We illustrate the impacts of different variance functions using a real data set on cow growth.
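The role of the variance function in the GEE framework can be made explicit as follows (standard notation, shown here only to fix ideas):

```latex
% \beta: mean parameters; Y_i: response vector for cluster i; \mu_i(\beta): mean;
% D_i = \partial\mu_i/\partial\beta; R_i(\alpha): working correlation;
% v(\cdot): assumed variance function; \phi: dispersion parameter.
\sum_{i=1}^{K} D_i^{\top} V_i^{-1}\big(Y_i - \mu_i(\beta)\big) = 0,
\qquad
V_i = \phi\, A_i^{1/2} R_i(\alpha)\, A_i^{1/2},
\qquad
A_i = \operatorname{diag}\{ v(\mu_{ij}) \}.
```

Misspecifying v(·) changes the weights V_i^{-1}, and hence the efficiency of the resulting estimator of β, which is the effect studied in the article.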
Abstract:
The method of generalised estimating equations for regression modelling of clustered outcomes allows for specification of a working correlation matrix that is intended to approximate the true correlation matrix of the observations. We investigate the asymptotic relative efficiency of the generalised estimating equation for the mean parameters when the correlation parameters are estimated by various methods. The asymptotic relative efficiency depends on three features of the analysis, namely (i) the discrepancy between the working correlation structure and the unobservable true correlation structure, (ii) the method by which the correlation parameters are estimated, and (iii) the 'design', by which we refer to both the structures of the predictor matrices within clusters and the distribution of cluster sizes. Analytical and numerical studies of realistic data-analysis scenarios show that the choice of working covariance model has a substantial impact on regression estimator efficiency. Protection against avoidable loss of efficiency associated with covariance misspecification is obtained when a 'Gaussian estimation' pseudolikelihood procedure is used with an AR(1) structure.
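The AR(1) working correlation structure singled out in the final sentence assumes correlations that decay geometrically with the separation between observations within a cluster (standard notation, for reference):

```latex
% \alpha: autocorrelation parameter, estimated for example by the
% Gaussian-estimation pseudolikelihood; j, k index observations within a cluster.
\operatorname{Corr}(Y_{ij}, Y_{ik}) = \alpha^{\,|j-k|}, \qquad 0 \le \alpha < 1.
```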
Abstract:
Several articles in this journal have studied optimal designs for testing a series of treatments to identify promising ones for further study. These designs formulate testing as an ongoing process until a promising treatment is identified. This formulation is considered more realistic but substantially increases the computational complexity. In this article, we show that these new designs, which control the error rates for a series of treatments, can be reformulated as conventional designs that control the error rates for each individual treatment. This reformulation leads to a more meaningful interpretation of the error rates and hence easier specification of the error rates in practice. It also allows us to use conventional designs from published tables or standard computer programs to design trials for a series of treatments. We illustrate these ideas using a study in soft tissue sarcoma.