863 results for inherent requirements
Abstract:
The development of methods providing reliable estimates of demographic parameters (e.g., survival rates, fecundity) for wild populations is essential to better understand the ecology and conservation requirements of individual species. A number of methods exist for estimating the demographics of stage-structured populations, but inherent mathematical complexity often limits their uptake by conservation practitioners. Estimating survival rates for pond-breeding amphibians is further complicated by their complex migratory and reproductive behaviours, often resulting in nonobservable states and successive cohorts of eggs and tadpoles. Here we used comprehensive data on 11 distinct breeding toad populations (Bufo calamita) to clarify and assess the suitability of a relatively simple method [the Kiritani-Nakasuji-Manly (KNM) method] to estimate the survival rates of stage-structured populations with overlapping life stages. The study shows that the KNM method is robust and provides realistic estimates of amphibian egg and larval survival rates for species in which breeding can occur as a single pulse or over a period of several weeks. The study also provides estimates of fecundity for seven distinct toad populations and indicates that it is essential to use reliable estimates of fecundity to limit the risk of under- or overestimating the survival rates when using the KNM method. Survival and fecundity rates for B. calamita populations were then used to define population matrices and make a limited exploration of their growth and viability. The findings of the study recently led to the implementation of practical conservation measures at the sites where populations were most vulnerable to extinction. © 2010 The Society of Population Ecology and Springer.
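At its core, the KNM method comes down to ratios of areas under stage-frequency curves sampled through time, under assumptions of a constant daily survival rate and sampling that spans the whole cohort. Purely as an illustration of that core calculation (not of the authors' full analysis, which also relies on the fecundity estimates mentioned above), a minimal Python sketch with invented sampling data might look like this:

```python
import numpy as np

def knm_stage_survival(times, counts):
    """Core area-ratio step of the Kiritani-Nakasuji-Manly (KNM) method.

    times  : sampling times in days.
    counts : counts[t][s] = individuals observed in stage s at time t,
             with stages ordered from earliest (eggs) to latest.
    Returns the estimated survival from each stage to the next as the
    ratio A[s+1] / A[s], where A[s] is the area under the curve of
    individuals in stage s *or any later stage*.
    """
    times = np.asarray(times, dtype=float)
    counts = np.asarray(counts, dtype=float)
    # individuals in stage s or later at each sampling time
    cum = counts[:, ::-1].cumsum(axis=1)[:, ::-1]
    # trapezoidal area under each of those curves
    dt = np.diff(times)
    areas = ((cum[:-1] + cum[1:]) / 2 * dt[:, None]).sum(axis=0)
    return areas[1:] / areas[:-1]

# Invented weekly samples of a single egg/tadpole/metamorph cohort.
times = [0, 7, 14, 21, 28, 35]
counts = [[1000,   0,  0],
          [ 400, 350,  0],
          [   0, 420, 10],
          [   0, 180, 60],
          [   0,  40, 90],
          [   0,   0, 30]]
print(knm_stage_survival(times, counts))  # [egg->tadpole, tadpole->metamorph]
```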
Abstract:
Purpose: The UK government argues that the benefits of public private partnership (PPP) in delivering public infrastructure stem from: transferring risks to the private sector within a structure in which financiers put their own capital at risk; and the performance-based payment mechanism, reinforced by the due diligence requirements imposed by the lenders financing the projects (HM Treasury, 2010). Prior studies of risk in PPPs have investigated ‘what’ risks are allocated and to ‘whom’, that is, to the public or the private sector. The purpose of this study is to examine ‘how’ and ‘why’ PPP risks are diffused by their financiers.
Design/methodology/approach: This study focuses on the financial structure of PPPs and on their financiers. Empirical evidence comes from interviews conducted with equity and debt financiers.
Findings: The findings show that the financial structure of the deals generates risk aversion in both debt and equity financiers and that the need to attract affordable finance leads to risk diffusion through a network of companies using various means that include contractual mitigation through insurance, performance support guarantees, interest rate swaps and inflation hedges. Because of the complexity this process generates, both procurers and suppliers need expensive expert advice. The risk aversion and diffusion and the consequent need for advice add cost to the projects, impacting on the government’s economic argument for risk transfer.
Limitations and implications: The empirical work covers the private finance initiative (PFI) type of PPP arrangements and therefore the risk diffusion mechanisms may not be generalisable to other forms of PPP, especially those that do not involve the use of high leverage or private finance. Moreover, the scope of this research is limited to exploring the diffusion of risk in the private sector. Further research is needed on how risk is diffused in other settings and on the value-for-money implications of risk diffusion in PPP contracts.
Originality/value: The expectation inherent in PPP is that the private sector will better manage those risks allocated to it and because private capital is at risk, financiers will perform due diligence with the ultimate outcome that only viable projects will proceed. This paper presents empirical evidence that raises questions about these expectations.
Key words: public private partnership, risk management, diffusion, private finance initiative, financiers
Abstract:
As a class of defects in software requirements specification, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in some other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take into account the related non-canonical requirements in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency to those that are suitable for managing non-canonical requirements, which deals with incompleteness and redundancy, in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising them according to the chosen proposals. This generalization can be considered as an attempt to handle non-canonical requirements along with logic-based inconsistency handling in requirements engineering.
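The abstract only names the five parts of the framework, so the following is no more than a toy sketch of that pipeline shape, with hypothetical function names and a deliberately naive propositional encoding standing in for the paper's logic-based identification and measurement machinery:

```python
from itertools import combinations

# Toy encoding: each requirement is a plain sentence; "not <s>" negates it.

def identify(reqs):
    """Spot candidate non-canonical requirements: exact duplicates
    (redundancy) and directly negated pairs (inconsistency)."""
    dup = [r for i, r in enumerate(reqs) if r in reqs[:i]]
    neg = [(a, b) for a, b in combinations(set(reqs), 2)
           if b == f"not {a}" or a == f"not {b}"]
    return {"redundant": dup, "inconsistent": neg}

def measure(found):
    """Crude severity measure: just the count per category."""
    return {kind: len(items) for kind, items in found.items()}

def propose(found):
    """One hypothetical proposal per finding."""
    return ([("drop duplicate", r) for r in found["redundant"]] +
            [("reconcile", pair) for pair in found["inconsistent"]])

def choose(proposals, scores):
    """Handle the worst-measured category first (a stand-in for the
    paper's notion of commonly acceptable proposals)."""
    return sorted(proposals, key=lambda p: -scores.get(
        "redundant" if p[0] == "drop duplicate" else "inconsistent", 0))

def revise(reqs, accepted):
    """Apply only the mechanical proposals; contradictions are left to
    stakeholders to reconcile."""
    out = list(reqs)
    for action, detail in accepted:
        if action == "drop duplicate" and detail in out:
            out.remove(detail)
    return out

reqs = ["logins are audited", "logins are audited",
        "records are encrypted", "not records are encrypted"]
found = identify(reqs)
print(revise(reqs, choose(propose(found), measure(found))))
```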
Abstract:
Surface treatments are used as part of either a maintenance programme or repair work. In both cases, they provide additional protection to the concrete by either arresting or reducing the penetration of aggressive substances from the environment. Numerous materials are available for this purpose and their inherent generic properties differ considerably. Quite often this poses difficulties to practising engineers when selecting a surface treatment for a specific situation. In this review an attempt is made to explain the protective aspects of various surface treatments so that their selection can be made easier. The basic aspects of surface treatments are discussed: function, classification and performance requirements.
Towards an understanding of the causes and effects of software requirements change: two case studies
Abstract:
Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide an opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising from the ‘market’ change source, for the remaining change sources the results indicate significant differences in cost, value to the customer, and management considerations. Findings show that higher cost and value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
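As an aside on the kind of analysis this implies, a rank-correlation check of requirement attributes against volatility takes only a few lines; the data below are invented stand-ins for per-requirement attribute ratings and change counts, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 40  # hypothetical requirements

# Invented attribute ratings (1-5) and change counts.
novelty    = rng.integers(1, 6, n)
complexity = rng.integers(1, 6, n)
dependency = rng.integers(1, 6, n)
volatility = rng.poisson(dependency)  # number of changes per requirement

for name, attr in [("novelty", novelty), ("complexity", complexity),
                   ("dependency", dependency)]:
    rho, p = spearmanr(attr, volatility)
    print(f"{name:10s}  rho={rho:+.2f}  p={p:.3f}")
```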
Abstract:
Free fatty acid receptor 2 (FFA2; GPR43) is a G protein-coupled seven-transmembrane receptor for short-chain fatty acids (SCFAs) that is implicated in inflammatory and metabolic disorders. The SCFA propionate has close to optimal ligand efficiency for FFA2 and can hence be considered as highly potent given its size. Propionate, however, does not discriminate between FFA2 and the closely related receptor FFA3 (GPR41). To identify FFA2-selective ligands and understand the molecular basis for FFA2 selectivity, a targeted library of small carboxylic acids was examined using holistic, label-free dynamic mass redistribution technology for primary screening and the receptor-proximal G protein [³⁵S]guanosine 5'-(3-O-thio)triphosphate activation, inositol phosphate, and cAMP accumulation assays for hit confirmation. Structure-activity relationship analysis allowed formulation of a general rule to predict selectivity for small carboxylic acids at the orthosteric binding site, where ligands with substituted sp³-hybridized alpha-carbons preferentially activate FFA3, whereas ligands with sp²- or sp-hybridized alpha-carbons prefer FFA2. The orthosteric binding mode was verified by site-directed mutagenesis: replacement of orthosteric site arginine residues by alanine in FFA2 prevented ligand binding, and molecular modeling predicted the detailed mode of binding. Based on this, selective mutation of three residues to their non-conserved counterparts in FFA3 was sufficient to transfer FFA3 selectivity to FFA2. Thus, selective activation of FFA2 via the orthosteric site is achievable with rather small ligands, a finding with significant implications for the rational design of therapeutic compounds selectively targeting the SCFA receptors.
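The stated alpha-carbon rule is simple enough to turn into a rough cheminformatics screen. The helper below is hypothetical (the paper describes no such tool) and ignores the "substituted" qualifier, so small unbranched sp3 acids such as propionate, which the abstract notes are non-selective, would still be flagged as FFA3-leaning:

```python
from rdkit import Chem

SP3 = Chem.rdchem.HybridizationType.SP3
ACID = Chem.MolFromSmarts("[CX3](=O)[OX2H1,OX1-]")  # -COOH or -COO-

def alpha_carbon_rule(smiles: str) -> str:
    """Apply the abstract's selectivity rule as a rough screen:
    sp2/sp alpha-carbons lean FFA2, sp3 alpha-carbons lean FFA3."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"unparseable SMILES: {smiles}")
    matches = mol.GetSubstructMatches(ACID)
    if not matches:
        return "no carboxylic acid group"
    carboxyl_c = mol.GetAtomWithIdx(matches[0][0])
    alphas = [a for a in carboxyl_c.GetNeighbors() if a.GetAtomicNum() == 6]
    if not alphas:
        return "no alpha-carbon (formic acid)"
    return ("leans FFA3 (sp3 alpha-carbon)"
            if alphas[0].GetHybridization() == SP3
            else "leans FFA2 (sp2/sp alpha-carbon)")

for smi in ["CCC(=O)O",       # propionate: sp3 (non-selective in practice)
            "C=CC(=O)O",      # acrylate: sp2
            "CC(C)C(=O)O"]:   # isobutyrate: substituted sp3
    print(smi, "->", alpha_carbon_rule(smi))
```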
Abstract:
This manuscript describes how motor behaviour researchers who are not at the same time expert roboticists may implement an experimental apparatus, which has the ability to dictate torque fields around a single joint on one limb or single joints on multiple limbs without otherwise interfering with the inherent dynamics of those joints. Such an apparatus expands the exploratory potential of the researcher wherever experimental distinction of factors may necessitate independent control of torque fields around multiple limbs, or the shaping of torque fields of a given joint independently of its plane of motion, or its directional phase within that plane. The apparatus utilizes torque motors. The challenge with torque motors is that they impose added inertia on limbs and thus attenuate joint dynamics. We eliminated this attenuation by establishing an accurate mathematical model of the robotic device using the Box-Jenkins method, and cancelling out its dynamics by employing the inverse of the model as a compensating controller. A direct measure of the remnant inertial torque as experienced by the hand during a 50 s period of wrist oscillations that increased gradually in frequency from 1.0 to 3.8 Hz confirmed that the removal of the inertial effect of the motor was effectively complete.
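The abstract does not give the identified model, so the sketch below only illustrates the general idea of cancelling identified dynamics with a model inverse; it substitutes a first-order ARX fit on simulated data for the paper's Box-Jenkins identification:

```python
import numpy as np

# Assumed (unlike the paper) first-order ARX structure:
#   y[k] = a*y[k-1] + b*u[k-1] + noise
# where u is the commanded torque and y the measured motor response.

rng = np.random.default_rng(1)
a_true, b_true = 0.9, 0.2
u = rng.standard_normal(500)                 # excitation signal
y = np.zeros_like(u)
for k in range(1, len(u)):
    y[k] = a_true * y[k-1] + b_true * u[k-1] + 0.01 * rng.standard_normal()

# Least-squares ARX fit (a stand-in for Box-Jenkins identification).
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

def compensate(desired, y_prev):
    """Invert the fitted model: choose the command that drives the device
    to the desired output, cancelling its own (identified) dynamics."""
    return (desired - a_hat * y_prev) / b_hat

print(f"identified a={a_hat:.3f}, b={b_hat:.3f} (true {a_true}, {b_true})")
print("feedforward command for a desired output of 1.0:",
      compensate(1.0, y_prev=y[-1]))
```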