924 results for Specification
Abstract:
The conventional radial basis function (RBF) network optimization methods, such as orthogonal least squares or the two-stage selection, can produce a sparse network with satisfactory generalization capability. However, the RBF width, as a nonlinear parameter in the network, is not easy to determine. In the aforementioned methods, the width is always pre-determined, either by trial-and-error, or generated randomly. Furthermore, all hidden nodes share the same RBF width. This will inevitably reduce the network performance, and more RBF centres may then be needed to meet a desired modelling specification. In this paper we investigate a new two-stage construction algorithm for RBF networks. It utilizes the particle swarm optimization method to search for the optimal RBF centres and their associated widths. Although the new method needs more computation than conventional approaches, it can greatly reduce the model size and improve model generalization performance. The effectiveness of the proposed technique is confirmed by two numerical simulation examples.
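The centre-and-width search described above can be sketched as a toy particle swarm optimization loop. This is a minimal illustration under assumed settings (swarm size, inertia and acceleration constants, a 1-D sine target, least-squares output weights), not the paper's algorithm:

```python
# Sketch: PSO searches jointly for RBF centres and per-node widths;
# fitness is the least-squares training error of the resulting network.
# All constants below (swarm size, inertia, etc.) are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X).ravel()

M = 4           # number of RBF hidden nodes
D = X.shape[1]  # input dimension

def fitness(p):
    """p packs M centres (M*D values) and M widths; returns training MSE."""
    centres = p[: M * D].reshape(M, D)
    widths = np.abs(p[M * D:]) + 1e-3            # keep widths positive
    d2 = ((X[:, None, :] - centres[None]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * widths ** 2))        # hidden-layer design matrix
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear weights by least squares
    return np.mean((Phi @ w - y) ** 2)

# Standard PSO update with inertia plus cognitive/social attraction terms
n_part, n_iter = 20, 100
dim = M * D + M
pos = rng.uniform(-3, 3, (n_part, dim))
vel = np.zeros((n_part, dim))
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_part, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -50, 50)   # bounded box for numerical safety
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(f"best training MSE: {pbest_f.min():.4f}")
```

Because the linear output weights are solved exactly for each candidate, the swarm only has to explore the nonlinear parameters (centres and widths), which is what makes per-node widths tractable here.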
Abstract:
We propose an exchange rate model that is a hybrid of the conventional specification with monetary fundamentals and the Evans–Lyons microstructure approach. We estimate a model augmented with order flow variables, using a unique data set: almost 100 monthly observations on interdealer order flow on dollar/euro and dollar/yen. The augmented macroeconomic, or “hybrid,” model exhibits greater in-sample stability and out-of-sample forecasting improvement vis-à-vis the basic macroeconomic and random walk specifications.
Abstract:
As a class of defects in software requirements specification, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in some other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take into account the related non-canonical requirements in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency to those that are suitable for managing non-canonical requirements, which deals with incompleteness and redundancy, in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts, identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising them according to the chosen proposals. This generalization can be considered as an attempt to handle non-canonical requirements along with logic-based inconsistency handling in requirements engineering.
Towards an understanding of the causes and effects of software requirements change: two case studies
Abstract:
Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising due to the change source of ‘market’, for the remainder of the change sources, results indicate a significant difference in cost, value to the customer and management considerations. Findings show that higher cost and value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
Abstract:
Details are presented of the DAC (DSP ASIC Compiler) silicon compiler framework. DAC allows a non-specialist to automatically design DSP ASICs and DSP ASIC cores directly from a high-level specification. Typical designs take only a few minutes, and the resulting layouts are comparable in area and performance to handcrafted designs.
Abstract:
A generator for the automated design of Discrete Cosine Transform (DCT) cores is presented. This can be used to rapidly create silicon circuits from a high-level specification. These compare very favourably with existing designs. The DCT cores produced are scalable in terms of point size as well as input/output and coefficient wordlengths, providing a high degree of flexibility. An example 8-point 1D DCT design occupies less than 0.92 mm² when implemented in a 0.35 µm double-level-metal CMOS technology and can be clocked at a rate of 100 MHz.
Abstract:
Using conjoint choice experiments, we surveyed 473 Swiss homeowners about their preferences for energy-efficiency home renovations. We find that homeowners are responsive to the upfront costs of the renovation projects, government-offered rebates, savings in energy expenses, the time horizon over which such savings would be realized, and thermal comfort improvement. The implicit discount rate is low, ranging from 1.5 to 3%, depending on model specification. This is consistent with Hassett and Metcalf (1993) and Metcalf and Rosenthal (1995), and with the fact that our scenarios contain no uncertainty. Respondents who feel completely uncertain about future energy prices are more likely to select the status quo (no renovations) in any given choice task and weight the costs of the investments more heavily than the financial gains (subsidies and savings on the energy bills). Renovations are more likely when respondents believe that climate change considerations are important determinants of home renovations. Copyright © 2013 by the IAEE. All rights reserved.
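As a toy illustration of how an implicit discount rate is recovered from such trade-offs (all numbers below are made up, not the study's data): find the rate at which the discounted stream of energy savings equals the upfront cost a respondent just accepts.

```python
# Hypothetical renovation scenario: find the discount rate r at which the
# present value of T years of energy savings equals the net upfront cost.
def present_value(savings, r, T):
    return sum(savings / (1 + r) ** t for t in range(1, T + 1))

cost, savings, T = 18000.0, 1000.0, 20   # made-up cost, annual saving, horizon

# Bisection on r: present value is decreasing in r, so if PV(mid) > cost
# the implied rate must be higher.
lo, hi = 0.0, 0.5
for _ in range(60):
    mid = (lo + hi) / 2
    if present_value(savings, mid, T) > cost:
        lo = mid
    else:
        hi = mid

print(f"implied discount rate: {lo:.3%}")
```

With these illustrative numbers the implied rate comes out in the low single digits, the same order of magnitude as the 1.5–3% range reported above.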
Abstract:
Explored the facial and cry characteristics that adults use when judging an infant's pain. Sixteen women viewed videotaped reactions of 36 newborns subjected to noninvasive thigh rubs and vitamin K injections in the course of routine care and rated discomfort. The group mean interrater reliability was high. Detailed descriptions of the infants' facial reactions and cry sounds permitted specification of the determinants of distress judgments. Several facial variables (a brow bulge, eyes squeezed shut, and deepened nasolabial fold constellation, and taut tongue) accounted for 49% of the variance in ratings of affective discomfort after controlling for ratings of discomfort during a noninvasive event. In a separate analysis not including facial activity, several cry variables (formant frequency, latency to cry) also accounted for variance (38%) in ratings. When the facial and cry variables were considered together, cry variables added little to the prediction of ratings in comparison to facial variables. Cry would seem to command attention, but facial activity, rather than cry, can account for the major variations in adults' judgments of neonatal pain.
Abstract:
Task dataflow languages simplify the specification of parallel programs by dynamically detecting and enforcing dependencies between tasks. These languages are, however, often restricted to a single level of parallelism. This language design is reflected in the runtime system, where a master thread explicitly generates a task graph and worker threads execute ready tasks and wake up their dependents. Such an approach is incompatible with state-of-the-art schedulers such as the Cilk scheduler, which minimize the creation of idle tasks (the work-first principle) and place all task creation and scheduling off the critical path. This paper proposes an extension to the Cilk scheduler in order to reconcile task dependencies with the work-first principle. We discuss the impact of task dependencies on the properties of the Cilk scheduler. Furthermore, we propose a low-overhead ticket-based technique for dependency tracking and enforcement at the object level. Our scheduler also supports renaming of objects in order to increase task-level parallelism. Renaming is implemented using versioned objects, a new type of hyperobject. Experimental evaluation shows that the unified scheduler is as efficient as the Cilk scheduler when tasks have no dependencies. Moreover, the unified scheduler is more efficient than SMPSs, a particular implementation of a task dataflow language.
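A heavily simplified, sequential sketch of object-level ticket-based dependency tracking in the spirit described above (the names `Obj`, `Task`, and `run_all` are illustrative; the real scheduler distinguishes read/write access modes and executes ready tasks concurrently):

```python
# Each object hands out monotonically increasing tickets to the tasks that
# access it; a task is ready only when every object it touches is
# "now serving" that task's ticket, which serializes conflicting accesses
# in submission order without building an explicit task graph.
from collections import deque

class Obj:
    def __init__(self):
        self.next_ticket = 0   # handed to each task that accesses the object
        self.now_serving = 0   # advanced when the holding task completes

class Task:
    def __init__(self, name, objs, body):
        self.name, self.objs, self.body = name, objs, body
        self.tickets = []
        for o in objs:                    # take one ticket per accessed object
            self.tickets.append(o.next_ticket)
            o.next_ticket += 1

    def ready(self):
        return all(o.now_serving == t for o, t in zip(self.objs, self.tickets))

def run_all(tasks):
    """Sequential stand-in for the worker loop; assumes some task is always ready."""
    order, pending = [], deque(tasks)
    while pending:
        t = pending.popleft()
        if t.ready():
            t.body()
            order.append(t.name)
            for o in t.objs:
                o.now_serving += 1        # release: wakes up dependents
        else:
            pending.append(t)
    return order

x = Obj()
log = []
run_all([
    Task("produce", [x], lambda: log.append("write x")),
    Task("consume", [x], lambda: log.append("read x")),
])
print(log)  # tasks on the same object run in submission order
```

The appeal of the scheme is that readiness is a counter comparison per object rather than a graph traversal, which is what keeps the tracking overhead low.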
Abstract:
BACKGROUND:
Glaucoma is a leading cause of blindness. Early detection is advocated but there is insufficient evidence from randomized controlled trials (RCTs) to inform health policy on population screening. Primarily, there is no agreed screening intervention. For a screening programme, agreement is required on the screening tests to be used, either individually or in combination, the person to deliver the test and the location where testing should take place. This study aimed to use ophthalmologists (who were experienced glaucoma subspecialists), optometrists, ophthalmic nurses and patients to develop a reduced set of potential screening tests and testing arrangements that could then be explored in depth in a further study of their feasibility for evaluation in a glaucoma screening RCT.
METHODS:
A two-round Delphi survey involving 38 participants was conducted. Materials were developed from a prior evidence synthesis. For round one, after some initial priming questions in four domains, specialists were asked to nominate three screening interventions, each intervention being a combination of the four domains: target population (age and higher-risk groups), site, screening test, and test operator (provider). More than 250 screening interventions were identified. For round two, responses were condensed into 72 interventions and each was rated by participants on a 0-10 scale in terms of feasibility.
RESULTS:
Using a cut-off of a median feasibility rating of ≥5.5 as evidence of agreement on intervention feasibility, six interventions were identified from round two. These were initiating screening at age 50, with a combination of two or three screening tests (varying combinations of tonometry and measures of visual function and optic nerve damage) organized in a community setting, with an ophthalmic-trained technical assistant delivering the tests. An alternative intervention was a 'glaucoma risk score' ascertained by questionnaire. The advisory panel recommended that further exploration of the feasibility of screening higher-risk populations and detailed specification of the screening tests was required.
CONCLUSIONS:
With systematic use of expert opinions, a shortlist of potential screening interventions was identified. Views of users, service providers and cost-effectiveness modeling are now required to identify a feasible intervention to evaluate in a future glaucoma screening trial.
Abstract:
The answer to the question of what it means to say that a right is absolute is often taken for granted, yet still sparks doubt and scepticism. This article investigates absoluteness further, bringing rights theory and the judicial approach on an absolute right together. A theoretical framework is set up that addresses two distinct but potentially related parameters of investigation: the first is what I have labelled the ‘applicability’ criterion, which looks at whether and when the applicability of the standard referred to as absolute can be displaced, in other words whether other considerations can justify its infringement; the second parameter, which I have labelled the ‘specification’ criterion, explores the degree to which and bases on which the content of the standard characterised as absolute is specified. This theoretical framework is then used to assess key principles and issues that arise in the Strasbourg Court’s approach to Article 3. It is suggested that this analysis allows us to explore both the distinction and the interplay between the two parameters in the judicial interpretation of the right and that appreciating the significance of this is fundamental to the understanding of and discourse on the concept of an absolute right.
Abstract:
The microtubule-associated protein, MAP65, is a member of a family of divergent microtubule-associated proteins from different organisms generally involved in maintaining the integrity of the central spindle in mitosis. The dicotyledon Arabidopsis thaliana and the monocotyledon rice (Oryza sativa) genomes contain 9 and 11 MAP65 genes, respectively. In this work, we show that the majority of these proteins fall into five phylogenetic clades, with the greatest variation between clades being in the C-terminal random coil domain. At least one Arabidopsis and one rice isotype is within each clade, indicating a functional specification for the C terminus. In At MAP65-1, the C-terminal domain is a microtubule binding region (MTB2) harboring the phosphorylation sites that control its activity. The At MAP65 isotypes show differential localization to microtubule arrays and promote microtubule polymerization with variable efficiency in a MTB2-dependent manner. In vivo studies demonstrate that the dynamics of the association and dissociation of different MAP65 isotypes with microtubules can vary up to 10-fold and that this correlates with their ability to promote microtubule polymerization. Our data demonstrate that the C-terminal variable region, MTB2, determines the dynamic properties of individual isotypes and suggest that slower turnover is conditional for more efficient microtubule polymerization.
Abstract:
We describe, for the first time, the microbial characterisation of hydrogel-forming polymeric microneedle arrays and the potential for passage of microorganisms into skin following microneedle penetration. Uniquely, we also present insights into the storage stability of these hygroscopic formulations, from physical and microbiological viewpoints, and examine clinical performance and safety in human volunteers. Experiments employing excised porcine skin and radiolabelled microorganisms showed that microorganisms can penetrate skin beyond the stratum corneum following microneedle puncture. Indeed, the numbers of microorganisms crossing the stratum corneum following microneedle puncture were greater than 10⁵ cfu in each case. However, no microorganisms crossed the epidermal skin. When using a 21G hypodermic needle, more than 10⁴ microorganisms penetrated into the viable tissue and 10⁶ cfu of Candida albicans and Staphylococcus epidermidis completely crossed the epidermal skin in 24 h. The hydrogel-forming materials contained no microorganisms following de-moulding and exhibited no microbial growth during storage, while also maintaining their mechanical strength, except when stored at a relative humidity of 86%. No microbial penetration through the swelling microneedles was detectable, while human volunteer studies confirmed that skin or systemic infection is highly unlikely when polymeric microneedles are used for transdermal drug delivery. Since no pharmacopoeial standards currently exist for microneedle-based products, the exact requirements for a proprietary product based on hydrogel-forming microneedles are at present unclear. However, we are currently working towards a comprehensive specification set for this microneedle system that may inform future developments in this regard.
Abstract:
This paper estimates the marginal willingness-to-pay for attributes of a hypothetical HIV vaccine using discrete choice modeling. We use primary data from 326 respondents from Bangkok and Chiang Mai, Thailand, in 2008–2009, selected using purposive, venue-based sampling across two strata. Participants completed a structured questionnaire and a full-rank discrete choice modeling task administered using computer-assisted personal interviewing. The choice experiment was used to rank eight hypothetical HIV vaccine scenarios, with each scenario comprising seven attributes (including cost), each of which had two levels. The data were analyzed in two alternative specifications: (1) best-worst; and (2) full-rank, using logit likelihood functions estimated with custom routines in the Gauss matrix programming language. In the full-rank specification, all vaccine attributes are significant predictors of the probability of vaccine choice. The biomedical attributes of the hypothetical HIV vaccine (efficacy, absence of VISP, absence of side effects, and duration of effect) are the most important attributes for HIV vaccine choice. On average, respondents are more than twice as likely to accept a vaccine with 99% efficacy as one with 50% efficacy. This translates to a willingness to pay US$383 more for a high-efficacy vaccine compared with the low-efficacy vaccine. Knowledge of the relative importance of determinants of HIV vaccine acceptability is important to ensure the success of future vaccination programs. Future acceptability studies of hypothetical HIV vaccines should use more finely grained biomedical attributes, and could also improve the external validity of results by including more levels of the cost attribute.
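The monetary figure reported above follows from the usual ratio rule in additive-utility choice models: willingness to pay for an attribute is its coefficient divided by the negative of the cost coefficient. The coefficients below are hypothetical, chosen only to reproduce a value of that magnitude, and are not the study's estimates.

```python
# Ratio rule for WTP in a linear-in-attributes conditional logit:
# U = beta_attr * attribute + beta_cost * cost + ...  =>
# WTP(attribute) = -beta_attr / beta_cost.
beta_efficacy = 1.15   # hypothetical coefficient on high (99%) efficacy
beta_cost = -0.003     # hypothetical coefficient on cost in US$

wtp_high_efficacy = -beta_efficacy / beta_cost
print(f"WTP for high vs low efficacy: US${wtp_high_efficacy:.0f}")
```

The rule works because dividing by the cost coefficient converts a utility difference into the dollar amount that produces the same utility change.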
Abstract:
The ecological footprint of food transport can be communicated using carbon dioxide emissions (CO2 label) or by providing information about both the length of time and the mileage travelled (food miles label). We use stated choice data to estimate conventional unobserved taste heterogeneity models and extend them to a specification that also addresses attribute non-attendance. The implied posterior distributions of the marginal willingness-to-pay values are compared graphically and are used in validation regressions. We find strong bimodality of the taste distribution as the emerging feature, with different groups of subjects having low and high valuations for these labels. The best-fitting model shows that CO2 and food miles valuations are strongly correlated. CO2 valuations can be high even for those respondents expressing low valuations for food miles; however, the reverse is not true. Taken together, the results suggest that consumers tend to value the CO2 label at least as much as, and sometimes more than, the food miles label.