94 results for Guarantee
Abstract:
In this paper, weighted fair rate allocation for the ATM available bit rate (ABR) service is discussed with attention to the minimum cell rate (MCR). Weighted fairness with MCR guarantee has been discussed recently in the literature. In those studies, each ABR virtual connection (VC) is first allocated its MCR, and the remaining available bandwidth is then shared among ABR VCs according to their weights. For the weighted fairness defined in this paper, the bandwidth is first allocated according to each VC's weight; if a VC's weighted share is less than its MCR, it is allocated its MCR instead of the weighted share. This weighted fairness with MCR guarantee is referred to as extended weighted (EXW) fairness. Certain theoretical issues related to EXW, such as its global solution and bottleneck structure, are first discussed in the paper. A distributed explicit rate allocation algorithm is then proposed to achieve EXW fairness in ATM networks. The algorithm is a general-purpose explicit rate algorithm in the sense that it can realise almost all of the fairness principles proposed for ABR so far with only minor modifications.
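For illustration, a minimal single-link sketch of the EXW rule just described, with hypothetical weights and MCRs: VCs whose weighted share falls below their MCR are pinned at the MCR, and the residual capacity is re-shared by weight among the others (the paper's distributed explicit rate algorithm operates network-wide, not on a single link).

```python
def exw_allocate(capacity, weights, mcrs, eps=1e-9):
    """Extended weighted (EXW) fair allocation on a single link (illustrative sketch).

    Assumes sum(mcrs) <= capacity (admission control). Each VC receives the larger
    of its MCR and its weighted share of the capacity left after MCR-pinned VCs
    are removed; pinning is repeated until the allocation is stable.
    """
    n = len(weights)
    pinned = [False] * n          # VCs held at their MCR
    alloc = [0.0] * n
    while True:
        free = capacity - sum(mcrs[i] for i in range(n) if pinned[i])
        wsum = sum(weights[i] for i in range(n) if not pinned[i])
        changed = False
        for i in range(n):
            if pinned[i]:
                alloc[i] = mcrs[i]
                continue
            share = free * weights[i] / wsum if wsum > eps else 0.0
            if share < mcrs[i]:
                pinned[i] = True   # weighted share below MCR: guarantee the MCR instead
                changed = True
            else:
                alloc[i] = share
        if not changed:
            return alloc

# Example: 100 Mbit/s link, three VCs
print(exw_allocate(100.0, weights=[1, 1, 2], mcrs=[10.0, 40.0, 5.0]))  # [20.0, 40.0, 40.0]
```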
Abstract:
This article discusses what recent statistics and public reports reveal about the funding of GEERS (now the FEG) and its bottom line. The article examines (1) whether there has been a “blowout” in the scheme which guarantees the recovery of employee entitlements in liquidations and (2) what might be done to put the scheme on a firmer fiscal footing.
Abstract:
Product Lifecycle Management (PLM) systems are widely used in the manufacturing industry. A core feature of such systems is to provide support for versioning of product data. As workflow functionality is increasingly used in PLM systems, the possibility emerges that the versioning transitions for product objects as encapsulated in process models do not comply with the valid version control policies mandated in the objects’ actual lifecycles. In this paper we propose a solution to tackle the (non-)compliance issues between processes and object version control policies. We formally define the notion of compliance between these two artifacts in product lifecycle management and then develop a compliance checking method which employs a well-established workflow analysis technique. This forms the basis of a tool which offers automated support to the proposed approach. By applying the approach to a collection of real-life specifications in a main PLM system, we demonstrate the practical applicability of our solution to the field.
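As a toy illustration of the compliance notion described above (not the paper's formal workflow-analysis method), one can check whether every version transition encapsulated in a process model is permitted by the object's version control policy; all names below are hypothetical.

```python
def find_violations(process_transitions, allowed_transitions):
    """Return version transitions used by the process model that the object's
    version control policy (its lifecycle) does not permit.

    Both arguments are sets of (from_state, to_state) pairs, e.g. ("in_work", "released").
    Illustrative only; the paper's method relies on a formal workflow analysis technique.
    """
    return set(process_transitions) - set(allowed_transitions)

# Hypothetical example: the process skips the mandatory "frozen" state
policy = {("in_work", "frozen"), ("frozen", "released"), ("released", "obsolete")}
process = {("in_work", "frozen"), ("in_work", "released")}
print(find_violations(process, policy))  # {('in_work', 'released')}
```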
Abstract:
Denial-of-service (DoS) attacks and distributed denial-of-service (DDoS) attacks attempt to temporarily disrupt users or computer resources to cause service unavailability to legitimate users in the internetworking system. The most common type of DoS attack occurs when adversaries flood a large amount of bogus data to interfere with or disrupt the service on the server. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are an example approach to help the server validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before successfully detecting the bogus messages, adversaries might be able to exploit this flaw to mount an attack to overwhelm the server resources. The attacker is able to perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify the attacks. This is an example of the DoS threats in most key establishment protocols: they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service-resistant mechanisms in key establishment protocols. In particular, we focus on the design of cryptographic protocols related to key establishment protocols that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS-resistance. Basically, the formal analysis approach is used not only to analyse and verify the security of a cryptographic scheme carefully but also to help in the design stage of new protocols with a high level of security guarantee. In this research, we focus on the analysis technique of Meadows' cost-based framework, and we implement a DoS-resistance model using Coloured Petri Nets. Meadows' cost-based framework is proposed to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols using interactive simulations. In addition, Coloured Petri Nets are able to help the protocol designer clarify and reduce inconsistencies in the protocol specification. Therefore, the second objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend a formal analysis approach to our new framework for improving DoS-resistance and evaluating the performance of the newly proposed mechanism. In summary, the specific outcomes of this research include the following results: 1. A taxonomy of denial-of-service-resistant strategies and techniques used in key establishment protocols; 2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols; 3. An implementation of Meadows's cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and 4. A development of new efficient and practical DoS-resistant mechanisms to improve the resistance to denial-of-service attacks in key establishment protocols.
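For context, a minimal sketch of a generic hash-based client puzzle (in the style of Juels and Brainard; not necessarily the construction studied in this thesis): the server issues a cheap challenge, the client must spend roughly 2^k hash operations to solve it, and the server verifies the solution with a single hash before committing resources to the expensive key establishment steps.

```python
import hashlib
import os

def make_puzzle(difficulty_bits):
    """Server side: issue a fresh nonce and a difficulty level (cheap)."""
    return os.urandom(16), difficulty_bits

def solve_puzzle(nonce, difficulty_bits):
    """Client side: brute-force a counter costing ~2**difficulty_bits hashes on average."""
    target = 2 ** (256 - difficulty_bits)
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return counter
        counter += 1

def verify(nonce, difficulty_bits, solution):
    """Server side: one hash suffices, so the verifier spends negligible resources."""
    digest = hashlib.sha256(nonce + solution.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)

nonce, k = make_puzzle(difficulty_bits=16)
sol = solve_puzzle(nonce, k)
assert verify(nonce, k, sol)
```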
Abstract:
Principal Topic: In this paper we seek to highlight the important intermediate role that the gestation process plays in entrepreneurship by examining its key antecedents and its consequences for new venture emergence. In doing so we take a behavioural perspective and argue that it is not only what a nascent venture is, but what it does (Katz & Gartner, 1988; Shane & Delmar, 2004; Reynolds, 2007) and when it does it during start-up (Reynolds & Miller, 1992; Lichtenstein, Carter, Dooley & Gartner, 2007) that is important. To extend an analogy from biological development, what we suggest is that the way a new venture is nurtured is just as fundamental as its nature. Much prior research has focused on the nature of new ventures and attempted to attribute variations in outcomes directly to the impact that resource endowments and investments have. While there is little doubt that venture resource attributes such as human capital, and specifically prior entrepreneurial experience (Alsos & Kolvereid, 1998), access to social capital (Davidsson & Honig, 2003) and financial capital have an influence, resource attributes themselves are distal from successful start-up endeavours and remain inanimate if not for the actions of the nascent venture. The key contribution we make is to shift focus from whether or not actions are taken to when these actions happen and how they are situated in the overall gestation process. Thus, we suggest that it is gestation process dynamics, or when gestation actions occur, that are more proximal to venture outcomes, and we focus on this. Recently scholars have highlighted the complexity that exists in the start-up or gestation process, be it temporal or contextual (Liao, Welsch & Tan, 2005; Lichtenstein et al. 2007). There is great variation in how long a start-up process might take (Reynolds & Miller, 1992), some processes require less action than others (Carter, Gartner & Reynolds, 1996), and the overall intensity of the start-up effort is also deemed important (Reynolds, 2007). And, despite some evidence that particular activities are more influential than others (Delmar & Shane, 2003), the order in which events may happen has, until now, remained largely indeterminate as regards its influence on success (Liao & Welsch, 2008). We suggest that it is this complexity of the intervening gestation process that attenuates the effect of resource endowment and has resulted in mixed findings in previous research. Thus, in order to reduce complexity we take a holistic view of the gestation process and argue that it is its dynamic properties that determine nascent venture attempt outcomes. Importantly, we acknowledge that particular gestation processes of themselves would not guarantee successful start-up; rather, it is the fit between the process dynamics and the venture's attributes (Davidsson, 2005) that is influential. So we aim to examine process dynamics by comparing sub-groups of venture types by resource attributes. Thus, as an initial step toward unpacking the complexity of the gestation process, this paper aims to establish the importance of its role as an intermediary between attributes of the nascent venture and the emergence of that venture. Here, we make a contribution by empirically examining gestation process dynamics and their fit with venture attributes.
We do this by, firstly, examining the nature of the influence that venture attributes such as human and social capital have on the dynamics of the gestation process, and secondly by investigating the effect that gestation process dynamics have on venture creation outcomes. Methodology and Propositions: In order to explore the importance that gestation process dynamics have in nascent entrepreneurship we conduct an empirical study of venture start-ups. Data is drawn from a screened random sample of 625 Australian nascent business ventures prior to them achieving consistent outcomes in the market. This data was collected during 2007/8 and 2008/9 as part of the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) project (Davidsson et al., 2008). CAUSEE is a longitudinal panel study conducted over four years, sourcing information from annually administered telephone surveys. Importantly for our study, this methodology allows for the capture and tracking of active nascent venture creation as it happens, thus reducing hindsight and selection biases. In addition, improved tests of causality may be made given that outcome measures are temporally removed from preceding events. The data analysed in this paper represents the first two of these four years, and for the first time has access to follow-up outcome measures for these venture attempts: 260 were successful, 126 were abandoned, and 191 are still in progress. With regard to venture attributes as gestation process antecedents, we examine specific human capital, measured as successful prior experience in entrepreneurship, and direct social capital of the venture as ‘team start-ups’. In assessing gestation process dynamics we follow Lichtenstein et al. (2007) in suggesting that the rate, concentration and timing of gestation activities may be used to summarise the complexity dynamics of that process. In addition, we extend this set of measures to include the interaction of discovery and exploitation by way of changes made to the venture idea. Those ventures with successful prior experience, or those that conduct symbiotic parallel start-up attempts, may be able to, or be forced to, leave their gestation action until later and still derive a successful outcome. In addition, access to direct social capital may provide the support upon which the venture may draw in order to persevere in the face of adversity, turning a seemingly futile start-up attempt into a success. On the other hand, prior experience may engender the foresight to terminate a venture attempt early should it be seen to be going nowhere. The temporal nature of these conjectures highlights the importance that process dynamics play and will be examined in this research. Statistical models are developed to examine gestation process dynamics. We use multivariate general linear modelling to analyse how human and social capital factors influence gestation process dynamics. In turn, we use event history models and stratified Cox regression to assess the influence that gestation process dynamics have on venture outcomes. Results and Implications: What entrepreneurs do is of interest to both scholars and practitioners alike. Thus the results of this research are important since they focus on nascent behaviour and its outcomes. While venture attributes themselves may be influential, this is of little actionable assistance to practitioners.
For example, it is unhelpful to say to the prospective first-time entrepreneur “you’ll be more successful if you have lots of prior experience in firm start-ups”. This research attempts to close this relevance gap by addressing what gestation behaviours might be appropriate, when actions are best focused, and most importantly in what circumstances. Further, we make a contribution to the entrepreneurship literature by examining the role that gestation process dynamics play in outcomes and by specifically relating these dynamics to the nature of the venture itself. This extension is, to the best of our knowledge, new to the research field.
Abstract:
This paper presents a novel algorithm for the gateway placement problem in Backbone Wireless Mesh Networks (BWMNs). Unlike existing algorithms, the new algorithm incrementally identifies gateways and assigns mesh routers to the identified gateways. The new algorithm is guaranteed to find a feasible gateway placement satisfying Quality-of-Service (QoS) constraints, including the delay constraint, relay load constraint and gateway capacity constraint. Experimental results show that its performance is as good as that of the best existing algorithms for the gateway placement problem. However, the new algorithm can also be used for BWMNs that do not form one connected component, and it is easy to implement and use.
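A simplified greedy sketch of the incremental idea described above, assuming hypothetical inputs: an adjacency list of mesh routers, a hop-count bound standing in for the delay constraint, and a per-gateway capacity. The published algorithm's selection rule and relay-load handling are more elaborate than this toy version.

```python
from collections import deque

def bfs_hops(adj, src):
    """Hop distance from src to every reachable router."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def place_gateways(adj, max_hops, capacity):
    """Greedy, incremental gateway placement (illustrative sketch): repeatedly pick an
    unassigned router as a gateway and assign to it up to `capacity` unassigned routers
    within `max_hops` hops. Works even if the network is not one connected component."""
    unassigned = set(adj)
    gateways = {}                      # gateway -> list of assigned routers
    while unassigned:
        g = max(unassigned,            # candidate covering the most unassigned routers
                key=lambda r: sum(1 for v, d in bfs_hops(adj, r).items()
                                  if v in unassigned and d <= max_hops))
        dist = bfs_hops(adj, g)
        members = sorted((v for v in unassigned if dist.get(v, max_hops + 1) <= max_hops),
                         key=dist.get)[:capacity]
        gateways[g] = members
        unassigned -= set(members)
    return gateways

# Hypothetical 6-router chain topology
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(place_gateways(adj, max_hops=2, capacity=3))
```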
Abstract:
Franchising has been widely accepted as an effective way to conduct and expand businesses. However, a franchise system is not a guarantee of success in the market; a successful franchise system should rely on a close and strong franchising relationship. Franchising is fundamentally a relationship management business. Franchising arrangements normally last for a number of years, so the franchisor and franchisee in the relationship are usually motivated to cooperate with each other. In addition, highly loyal franchisees may be obtained through a successful long-term franchising relationship. Over the last few decades, there has been a tremendous wave of interest in franchising relationships. However, little research has been conducted to determine the reasons for long-term franchising relationships. As a result, this study focuses on the important elements that might lead to a successful long-term franchising relationship. This study attempts to examine empirically three essential constructs (relationship quality, cooperation and customer loyalty) that might lead to successful long-term franchising relationships between franchisees and franchisors among convenience stores in Taiwan. Mailed questionnaires were utilised to collect the research data. A total of 500 surveys were mailed randomly to managers/supervisors of franchised convenience stores of the four main franchisors (7-ELEVEN, Family, Hi-Life and OK) in Taiwan. The final sample size is 120, yielding a response rate of 24 per cent. The results show that relationship quality positively influences the cooperative relationships between franchisors and franchisees. Relationship quality is also positively correlated with franchisees’ loyalty. Additionally, the results indicate that the cooperative relationships between franchisors and franchisees are significantly associated with franchisees’ loyalty.
Abstract:
Assessment plays an integral role in teaching and learning in Higher Education and teachers have a strong interest in debates and commentaries on assessment as and for learning. In a one-year graduate entry teacher preparation program, the temptation is to emphasize assessment in an attempt to ensure students “cover” everything as part of a robust preparation for the profession. The risk is that, for students, assessment drives curriculum, and time spent in the completion of assignments is no guarantee of either effective learning or authentic preparation for teaching. Interviews as assessment provide an opportunity for a learning experience as well as an authentic task, since students will shortly be interviewing for employment in a “real world” situation. This paper reports on a project experimenting with interview panels as authentic assessment with pre-service early childhood teachers. At the end of their first semester of study, students enrolled in the Graduate Diploma of Education program at the Queensland University of Technology in Australia were required to participate in a panel interview where they were graded by a panel made up of three faculty staff and one undergraduate student enrolled in the four-year Bachelor of Education program. Students and panel members completed a questionnaire on their experience after the interview. Results indicated that both students and staff valued the experience and felt it was authentic. Results are discussed in terms of how the assessment interview and portfolio presentation supports graduating students in their preparation for employment interviews, and how this authentic assessment task has benefits for both students and teaching staff.
Abstract:
The effects of particulate matter on the environment and public health have been widely studied in recent years. A number of studies in the medical field have tried to identify the specific effect of particulate exposure on human health, but agreement amongst these studies on the relative importance of particle size and origin with respect to health effects is still lacking. Nevertheless, air quality standards, like epidemiological attention, are moving towards a greater focus on the smaller particles. Current air quality standards only regulate the mass of particulate matter less than 10 μm in aerodynamic diameter (PM10) and less than 2.5 μm (PM2.5). The most reliable method for measuring Total Suspended Particles (TSP), PM10, PM2.5 and PM1 is the gravimetric method, since it directly measures PM concentration, guaranteeing effective traceability to international standards. This technique, however, does not allow correlation with short-term intra-day variations of atmospheric parameters that can influence ambient particle concentration and size distribution (emission strengths of particle sources, temperature, relative humidity, wind direction and speed, and mixing height), or with human activity patterns that may also vary over time periods considerably shorter than 24 hours. A continuous method to measure the number size distribution and total number concentration in the range 0.014 – 20 μm is the tandem system constituted by a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). In this paper, an uncertainty budget model for the measurement of airborne particle number, surface area and mass size distributions is proposed and applied to several typical aerosol size distributions. The estimation of such an uncertainty budget presents several difficulties due to (i) the complexity of the measurement chain, and (ii) the fact that the SMPS and APS can properly guarantee traceability to the International System of Measurements only in terms of number concentration. In fact, the surface area and mass concentration must be estimated on the basis of separately determined average density and particle morphology.
Keywords: SMPS-APS tandem system, gravimetric reference method, uncertainty budget, ultrafine particles.
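To illustrate the conversion step mentioned above, a minimal sketch deriving surface area and mass concentrations from a number size distribution under the simplifying assumptions of spherical particles and a single average density (the values below are hypothetical).

```python
import math

def number_to_surface_and_mass(diameters_um, number_conc_cm3, density_g_cm3):
    """Convert a number size distribution into surface area and mass concentrations,
    assuming spherical particles of uniform density (a simplifying morphology
    assumption that a real uncertainty budget must account for)."""
    surface_um2_cm3 = 0.0
    mass_ug_m3 = 0.0
    for d_um, n_cm3 in zip(diameters_um, number_conc_cm3):
        surface_um2_cm3 += n_cm3 * math.pi * d_um ** 2            # µm² per cm³
        volume_um3_cm3 = n_cm3 * math.pi / 6.0 * d_um ** 3         # µm³ per cm³
        mass_ug_m3 += volume_um3_cm3 * density_g_cm3               # 1 µm³/cm³ × 1 g/cm³ = 1 µg/m³
    return surface_um2_cm3, mass_ug_m3

# Hypothetical two-channel distribution: 0.1 µm and 1.0 µm particles
print(number_to_surface_and_mass([0.1, 1.0], [1e4, 1e2], density_g_cm3=1.5))
```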
Abstract:
Modelling of interferometric signals related to tear film surface quality is considered. In the context of tear film surface quality estimation in normal healthy eyes, two clinical parameters are of interest: the build-up time, and the average interblink surface quality. The former is closely related to the signal derivative while the latter to the signal itself. Polynomial signal models, chosen for a particular set of noisy interferometric measurements, can be optimally selected, in some sense, with a range of information criteria such as AIC, MDL, Cp, and CME. Those criteria, however, do not always guarantee that the true derivative of the signal is accurately represented and they often overestimate it. Here, a practical method for judicious selection of model order in a polynomial fitting to a signal is proposed so that the derivative of the signal is adequately represented. The paper highlights the importance of context-based signal modelling in model order selection.
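For context, a minimal sketch of information-criterion-based order selection for a polynomial fit (AIC only, Gaussian residuals assumed), of the kind the paper argues may not represent the signal derivative adequately; the data and candidate orders are hypothetical.

```python
import numpy as np

def select_polynomial_order(t, y, max_order=10):
    """Fit polynomials of increasing order and pick the one minimising AIC
    (Gaussian residuals assumed). Returns the chosen order and its coefficients."""
    n = len(y)
    best = None
    for k in range(1, max_order + 1):
        coeffs = np.polyfit(t, y, k)
        rss = float(np.sum((np.polyval(coeffs, t) - y) ** 2))
        aic = n * np.log(rss / n) + 2 * (k + 1)   # k+1 estimated coefficients
        if best is None or aic < best[0]:
            best = (aic, k, coeffs)
    return best[1], best[2]

# Hypothetical noisy interferometric-style signal
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)
y = 0.2 * t ** 2 - t + 1 + 0.05 * rng.standard_normal(t.size)
order, coeffs = select_polynomial_order(t, y)
derivative = np.polyval(np.polyder(coeffs), t)     # derivative of the fitted model
print(order)
```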
Abstract:
A new steady state method for determination of the electron diffusion length in dye-sensitized solar cells (DSCs) is described and illustrated with data obtained using cells containing three different types of electrolyte. The method is based on using near-IR absorbance methods to establish pairs of illumination intensity for which the total number of trapped electrons is the same at open circuit (where all electrons are lost by interfacial electron transfer) as at short circuit (where the majority of electrons are collected at the contact). Electron diffusion length values obtained by this method are compared with values derived by intensity modulated methods and by impedance measurements under illumination. The results indicate that the values of electron diffusion length derived from the steady state measurements are consistently lower than the values obtained by the non steady-state methods. For all three electrolytes used in the study, the electron diffusion length was sufficiently high to guarantee electron collection efficiencies greater than 90%. Measurement of the trap distributions by near-IR absorption confirmed earlier observations of much higher electron trap densities for electrolytes containing Li+ ions. It is suggested that the electron trap distributions may not be intrinsic properties of the TiO2 nanoparticles, but may be associated with electron-ion interactions.
Abstract:
There has been considerable interest over the years within the IS research community into how to shape articles for successful publication. Little effort has been made, however, to examine the reviewing criteria that make a difference to publication. We argue that, to provide better guidance to authors, more solid evidence is needed into the factors that contribute to acceptance decisions. This paper examines empirically the outcomes of the reviewing processes of three well-known IS conferences held in 2007. Our analyses reveal four major findings. First, the evaluation criteria that influence the acceptance/rejection decision vary by conference. Second, those differences can be explained in terms of the maturity and breadth of the specific conference of interest. Third, while objective review criteria influence acceptance/rejection decisions, subjective assessment on the part of the program committees may also play a substantial role. Fourth, while high scores on objective criteria are essential for acceptance, they do not guarantee acceptance. On the other hand, low scores on any criterion are likely to result in rejection.
Abstract:
This thesis is about the derivation of the addition law on an arbitrary elliptic curve and efficiently adding points on this elliptic curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. Mastered by 19th century mathematicians, the study of the theory of elliptic curves has been active for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete logarithm based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted the research in computational aspects of elliptic curves. This thesis falls into this same area of research where the main aim is to speed up the additions of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC. The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of recent computer algebra packages relating to their capabilities. Although the group law is unique, its evaluation can be performed using abundant (in fact infinitely many) formulae. As the second step, this thesis progresses to finding the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be practically obtained. In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. A list of these forms and their defining equations is given as follows: (a) Short Weierstrass form, y^2 = x^3 + ax + b; (b) Extended Jacobi quartic form, y^2 = dx^4 + 2ax^2 + 1; (c) Twisted Hessian form, ax^3 + y^3 + 1 = dxy; (d) Twisted Edwards form, ax^2 + y^2 = 1 + dx^2y^2; (e) Twisted Jacobi intersection form, bs^2 + c^2 = 1, as^2 + d^2 = 1. These forms are the most promising candidates for efficient computations and thus are considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves. From a high-level point of view, the following outcomes are achieved in this thesis. - Related literature results are brought together and further revisited. For most of the cases several missed formulae, algorithms, and efficient point representations are discovered. - Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs as long as the output is also an affine point in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced, which are more efficient than the existing alternatives in the literature. Most notably, the speeds of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms are improved. New unified addition formulae are proposed for the short Weierstrass form. New coordinate systems are studied for the first time. - An optimized implementation is developed using a combination of generic x86-64 assembly instructions and the plain C language. The practical advantages of the proposed algorithms are supported by computer experiments. - All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
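For orientation, a minimal affine-coordinate sketch of the textbook addition law on the short Weierstrass form y^2 = x^3 + ax + b over a prime field; the thesis is concerned with formulae and coordinate systems that are considerably faster than this naive version, and with handling all special cases uniformly.

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over GF(p).
    None represents the point at infinity (the group identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                            # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1 % p, -1, p) % p   # tangent slope (doubling)
    else:
        lam = (y2 - y1) * pow((x2 - x1) % p, -1, p) % p        # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Tiny example on y^2 = x^3 + 2x + 3 over GF(97); (3, 6) lies on the curve
p, a = 97, 2
P = (3, 6)
print(ec_add(P, P, a, p))   # doubling: (80, 10)
```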