913 results for Variable design parameters
Abstract:
The complexity of construction projects and the fragmentation of the construction industry undertaking those projects have effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve successful project and process execution. In addition, the decision-making mechanisms are illustrated from a client perspective, and the foundations for a learning organization and industry are laid within a consistent Process Protocol.
Abstract:
A very efficient learning algorithm for model subset selection is introduced, based on a new composite cost function that simultaneously optimizes the model approximation ability and the model robustness and adequacy. The derived model parameters are estimated via forward orthogonal least squares (OLS), but the model subset selection cost function includes a D-optimality design criterion that maximizes the determinant of the design matrix of the subset to ensure the robustness, adequacy and parsimony of the final model. Because the approach builds on the forward OLS algorithm, the new D-optimality-based cost function is constructed within the orthogonalization process, preserving the computational efficiency inherent in the conventional forward OLS approach. Illustrative examples are included to demonstrate the effectiveness of the new approach.
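Because det(WᵀW) for mutually orthogonal columns w_k factorizes into the product of the squared column norms, the D-optimality contribution of each candidate can be scored incrementally during the forward orthogonalization. A minimal sketch of that idea follows; the composite weighting `lam`, the greedy scoring rule and all names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def forward_ols_doptimality(X, y, n_terms, lam=0.1):
    """Greedy forward OLS subset selection with a D-optimality term.

    At each step every remaining candidate regressor has already been
    orthogonalized against the chosen terms, so det(W^T W) of the
    selected subset is the product of the squared column norms; picking
    the term with a large log ||w_k||^2 therefore grows the design
    matrix determinant, while the error reduction ratio (ERR) rewards
    approximation ability."""
    W = X.astype(float).copy()       # candidate columns, orthogonalized in place
    yy = float(y @ y)
    selected = []
    for _ in range(n_terms):
        best, best_score = None, -np.inf
        for k in range(W.shape[1]):
            if k in selected:
                continue
            nk = W[:, k] @ W[:, k]
            if nk < 1e-12:           # numerically dependent candidate
                continue
            err = (W[:, k] @ y) ** 2 / (nk * yy)   # error reduction ratio
            score = err + lam * np.log(nk)         # composite cost
            if score > best_score:
                best, best_score = k, score
        if best is None:             # no usable candidates remain
            break
        selected.append(best)
        wb = W[:, best]
        for k in range(W.shape[1]):  # orthogonalize the rest against wb
            if k not in selected:
                W[:, k] -= (W[:, k] @ wb) / (wb @ wb) * wb
    return selected
```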
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A simple new preprocessing method is first derived and applied to reduce the rule base, followed by a fine model detection process on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises the model prediction error as well as penalising the model parameter variance. NeuDeC yields unbiased model parameters with low parameter variance, with the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
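The stage-one metric can be motivated by the inequality trace((PᵀP)⁻¹) ≥ Σ_k 1/‖p_k‖² for a positive-definite PᵀP: low-energy rule columns dominate this A-optimality lower bound because they inflate parameter variance. A minimal sketch of such a pruning pass, assuming a simple keep-fraction heuristic (`keep_frac` is an illustrative choice, not the paper's rule):

```python
import numpy as np

def prune_rule_base(P, keep_frac=0.2):
    """Stage-one preprocessing sketch for NeuDeC-style rule reduction.

    P holds one column per candidate fuzzy-rule regressor.  Rules are
    ranked by column energy ||p_k||^2, since small-energy columns make
    the A-optimality lower bound sum_k 1/||p_k||^2 blow up; only the
    strongest are kept before the expensive forward-OLS stage."""
    energy = np.sum(P * P, axis=0)                       # ||p_k||^2 per rule
    n_keep = max(1, int(keep_frac * P.shape[1]))
    return np.sort(np.argsort(energy)[::-1][:n_keep])    # retained rule indices
```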
Abstract:
A very efficient learning algorithm for model subset selection is introduced, based on a new composite cost function that simultaneously optimizes the model approximation ability and model adequacy. The derived model parameters are estimated via forward orthogonal least squares, but the subset selection cost function includes an A-optimality design criterion that minimizes the variance of the parameter estimates, ensuring the adequacy and parsimony of the final model. An illustrative example is included to demonstrate the effectiveness of the new approach.
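In the orthogonalized regression space both objectives can be read directly off each candidate column. A hedged sketch of such a composite selection score is shown below, where w_k is the k-th orthogonalized regressor, y the output vector, and the weighting β an illustrative assumption rather than the paper's exact form:

```latex
% Composite score of candidate term k at each forward-selection step:
% the first term is the classical error reduction ratio (approximation
% ability); the second penalizes the parameter-estimate variance, which
% for an orthogonal column is proportional to 1 / (w_k^T w_k).
J_k \;=\;
\frac{(\mathbf{w}_k^{\top}\mathbf{y})^2}
     {(\mathbf{w}_k^{\top}\mathbf{w}_k)(\mathbf{y}^{\top}\mathbf{y})}
\;-\;
\beta\,\frac{1}{\mathbf{w}_k^{\top}\mathbf{w}_k}
```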
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than previous designs.
Abstract:
The EC Regulation No. 1924/2006 on nutrition and health claims made on foods has generated considerable debate and concern among scientists and industry. At the time of writing, the European Food Safety Authority (EFSA) has not approved any probiotic claims, despite numerous human trials and meta-analyses showing evidence of beneficial effects. On 29th and 30th September 2010, ten independent academic scientists with a documented record in probiotic research met to discuss designs for future probiotic studies to demonstrate health benefits for gut and immune function. The expert panel recommended the following: (i) always formulate a precise and concrete hypothesis, and appropriate goals and parameters, before starting a trial; (ii) ensure trials have sufficient sample size, such that they are adequately powered to reach statistically significant conclusions, either supporting or rejecting the a priori hypothesis, taking into account adjustment for multiple testing (this might necessitate more than one recruitment site); (iii) ensure trials are of appropriate duration; (iv) focus on a single, primary objective and only evaluate multiple parameters when they are hypothesis-driven. The panel agreed that there was an urgent need to better define which biomarkers are considered valuable for substantiation of a health claim. As a first step, the panel welcomed the publication, on the day of the meeting, of EFSA's draft guidance document on immune and gut health, although it came too late for study designs and dossiers to be adjusted accordingly. New validated biomarkers need to be identified in order to properly determine the range of physiological functions influenced by probiotics. In addition, validated biomarkers reflecting risk factors for disease are required for article 14 claims (EC Regulation No. 1924/2006). Finally, the panel concluded that consensus among scientists is needed to decide appropriate clinical endpoints for trials.
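Recommendation (ii) corresponds to a standard sample-size calculation with the significance level adjusted for multiple testing. A minimal sketch, assuming a two-arm trial with a continuous primary endpoint and a Bonferroni correction; every number in it is hypothetical, not a value endorsed by the panel.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning numbers for a probiotic trial comparing a
# treatment arm against placebo on one continuous endpoint.
effect_size = 0.4                 # standardized mean difference (Cohen's d)
power = 0.80                      # chance of detecting a true effect
n_comparisons = 3                 # endpoints tested -> Bonferroni adjustment
alpha = 0.05 / n_comparisons      # adjusted significance level

n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=alpha, power=power,
    alternative="two-sided")
print(f"participants needed per arm: {n_per_arm:.0f}")
```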
Abstract:
As in any technology system, analysis and design issues are among the fundamental challenges in persuasive technology. Currently, the Persuasive Systems Development (PSD) framework is considered the most comprehensive framework for the design and evaluation of persuasive systems. However, the framework provides limited detail to guide the selection of appropriate techniques given the variable nature of users or of use over time. In light of this, we propose a model for analysing and implementing behavioural change in persuasive technology, the 3D-RAB model. The 3D-RAB model represents the three-dimensional relationships between attitude towards behaviour, attitude towards change or maintaining a change, and current behaviour, and distinguishes variable levels in a user’s cognitive state. As such, it provides a framework which could be used to select appropriate techniques for persuasive technology.
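As a rough illustration of how the three dimensions yield distinguishable cognitive states, the sketch below coarsely discretises each axis as favourable/unfavourable, giving eight states. The binary discretisation and the class names are assumptions made here for illustration; the model itself allows variable levels along each dimension.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class CognitiveState:
    # Each field is one 3D-RAB dimension, coarsened to a boolean here.
    attitude_to_behaviour: bool   # favourable towards the target behaviour?
    attitude_to_change: bool      # favourable towards changing/maintaining it?
    current_behaviour: bool       # currently performing the behaviour?

# Enumerate the eight coarse states a persuasive system might distinguish
# when choosing which persuasion technique to apply to a given user.
states = [CognitiveState(*levels) for levels in product([True, False], repeat=3)]
for s in states:
    print(s)
```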
Abstract:
This paper presents a completely new design of a bogie frame made of glass-fibre-reinforced composites, together with its performance under various loading conditions as predicted by finite element analysis (FEA). The bogie consists of two frames, one placed on top of the other, and two axle ties connecting the axles. Each frame consists of two side arms with a transom between them. The top frame is thinner, more compliant and more highly curved than the bottom frame. Variable vertical stiffness is achieved before and after contact between the two frames at the central section of the bogie, to cope with different load levels. Finite element analysis played a very important role in the design of this structure: the stiffness and stress levels of the full-scale bogie under various loading conditions have been predicted using Marc from MSC Software. To verify the FEA models, a fifth-scale prototype of the bogie was made and tested under quasi-static loading conditions. The fifth-scale test results were used to fine-tune details such as contact and friction in the fifth-scale FEA models, and these conditions were then applied to the full-scale models. The FEA results show that the stress levels in all directions are low compared with the material strengths.
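The variable vertical stiffness can be pictured as a piecewise-linear force-deflection law: only the compliant top frame works until it closes the central gap and contacts the bottom frame, after which the two frames carry load in parallel. A minimal sketch of that behaviour; every numerical value is an illustrative assumption, not a figure from the paper.

```python
def vertical_force(x, k_top=1.2e6, k_parallel=3.5e6, gap=0.02):
    """Piecewise-linear model of the two-frame bogie (x in m, force in N).

    Below gap closure only the top frame's stiffness k_top acts; after
    contact the frames deflect together at the stiffer combined rate
    k_parallel, giving the load-dependent vertical stiffness the design
    aims for."""
    if x <= gap:
        return k_top * x
    return k_top * gap + k_parallel * (x - gap)

# e.g. force at 10 mm (pre-contact) and 30 mm (post-contact) deflection
print(vertical_force(0.010), vertical_force(0.030))
```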
Abstract:
Robustness in multi-variable control system design requires that the solution to the design problem be insensitive to perturbations in the system data. In this paper we discuss measures of robustness for generalized state-space, or descriptor, systems and describe algorithmic techniques for optimizing robustness for various applications.
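One widely used sensitivity measure for a descriptor pencil (A, E), with E possibly singular, is the condition number of each finite generalized eigenvalue, κᵢ = ‖xᵢ‖‖yᵢ‖ / |yᵢᴴ E xᵢ| for right and left eigenvectors xᵢ, yᵢ. The sketch below computes it numerically; it is a standard measure offered as an illustrative stand-in for the specific robustness measures developed in the paper.

```python
import numpy as np
from scipy.linalg import eig

def eigenvalue_sensitivities(A, E):
    """Condition numbers of the generalized eigenvalues of E x' = A x.

    A large kappa_i flags an eigenvalue that is highly sensitive to
    perturbations in the system data; infinite values correspond to
    defective or infinite eigenvalues of the descriptor pencil."""
    lam, vl, vr = eig(A, E, left=True, right=True)
    kappas = np.empty(len(lam))
    for i in range(len(lam)):
        x, y = vr[:, i], vl[:, i]
        denom = abs(y.conj() @ (E @ x))
        kappas[i] = np.inf if denom < 1e-14 else (
            np.linalg.norm(x) * np.linalg.norm(y) / denom)
    return lam, kappas
```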
Abstract:
Physiological and yield traits such as stomatal conductance (mmol m-2 s-1), leaf relative water content (RWC, %) and grain yield per plant were studied in a separate experiment. Results revealed that five of the sixteen cultivars, viz. Anmol, Moomal, Sarsabz, Bhitai and Pavan, appeared to be relatively more drought tolerant. Based on these morpho-physiological results, the studies were continued to examine these cultivars for drought tolerance at the molecular level. Initially, four well-recognized primers for dehydrin genes (DHNs), responsible for drought induction in T. durum L., T. aestivum L. and O. sativa L., were used to profile the gene sequence of the sixteen wheat cultivars. The primers amplified the DHN genes variably: primer WDHN13 (T. aestivum L.) amplified the DHN gene in only seven cultivars, whereas primer TdDHN15 (T. durum L.) amplified it in all sixteen cultivars, with differing DNA banding patterns and some cultivars showing a second, weaker DNA band. The third primer, TdDHN16 (T. durum L.), showed an entirely different PCR amplification pattern, notably two strong DNA bands, while the fourth primer, RAB16C (O. sativa L.), failed to amplify the DHN gene in any of the cultivars. Examination of the DNA sequences revealed several interesting features. First, it identified the two-exon/one-intron structure of this gene (complete sequences not shown), a feature not previously described in the two database cDNA sequences available from T. aestivum L. (gi|21850). Secondly, the analysis identified several single nucleotide polymorphism (SNP) positions in the gene sequence. Although the complete gene sequence was not obtained for all the cultivars, there were a total of 38 variable positions in the exonic (coding region) sequence, from a total gene length of 453 nucleotides. The SNP matrix shows 37 of these positions, with the individual sequence at each position given for each of the 14 cultivars included in this analysis (the sequence of two cultivars was not obtained). It demonstrated considerable diversity for this gene, with only three cultivars, i.e. TJ-83, Marvi and TD-1, matching the consensus sequence. All other cultivars showed a unique combination of SNPs. To establish a functional link between these polymorphisms and drought tolerance in wheat, a more detailed study involving directed mutation of this gene and DHN gene expression would be necessary.
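The SNP tabulation described above amounts to scanning aligned sequences column by column and recording every polymorphic position against the consensus. A minimal sketch; the sequences below are hypothetical, with cultivar names borrowed from the text for illustration only.

```python
from collections import Counter

def snp_positions(aligned_seqs):
    """Given equal-length aligned exonic sequences (one per cultivar),
    return each position where more than one nucleotide occurs, with
    the consensus base and the per-cultivar variants."""
    length = len(next(iter(aligned_seqs.values())))
    variable = {}
    for pos in range(length):
        column = {name: seq[pos] for name, seq in aligned_seqs.items()}
        counts = Counter(column.values())
        if len(counts) > 1:                             # polymorphic position
            consensus = counts.most_common(1)[0][0]
            variants = {n: b for n, b in column.items() if b != consensus}
            variable[pos + 1] = (consensus, variants)   # 1-based position
    return variable

# Hypothetical example with three cultivars:
seqs = {"TJ-83": "ATGGCA", "Marvi": "ATGGCA", "Anmol": "ATGACA"}
print(snp_positions(seqs))   # {4: ('G', {'Anmol': 'A'})}
```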
Abstract:
This study attempts to fill the existing gap in the simulation of variable-flow distribution systems by developing new pressure-governing components. These components capture the actual, ever-changing system performance curve in variable-flow distribution systems, together with the prediction of problematic behaviours such as starving, overflow and lack of controllability of the flow rate in different branches of a hydronic system. The performance of the proposed components is verified using a case study under design and off-design conditions. Full integration of the new components within the TRNSYS simulation package is a further advantage of this study, making it more applicable for designers in both the design and commissioning of hydronic systems.
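At its core, a pressure-governing component must locate the operating point where the pump curve intersects the ever-changing system curve. The sketch below does this by bisection for illustrative quadratic curves; it is not the TRNSYS component model, and all coefficients are assumptions.

```python
def operating_point(pump_head, system_head, q_max=20.0, tol=1e-6):
    """Find the flow (L/s) at which pump head equals system head.

    Bisection works because pump head falls and system head rises with
    flow, so their difference crosses zero exactly once on [0, q_max]."""
    lo, hi = 0.0, q_max
    while hi - lo > tol:
        q = 0.5 * (lo + hi)
        if pump_head(q) > system_head(q):
            lo = q          # pump still overcomes the circuit: raise flow
        else:
            hi = q
    return 0.5 * (lo + hi)

pump = lambda q: 30.0 - 0.05 * q**2       # head (m) vs flow, illustrative
system = lambda q: 5.0 + 0.12 * q**2      # static head + friction losses
print(f"flow at balance: {operating_point(pump, system):.2f} L/s")   # ~12.13
```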
Abstract:
As laid out in its convention, ECMWF has eight different objectives. One of the major objectives is the preparation, on a regular basis, of the data necessary for medium-range weather forecasts. The interpretation of this item is that the Centre will make forecasts once a day for a prediction period of up to 10 days. It is also evident that the Centre should not carry out any real weather forecasting but merely disseminate to the Member Countries the basic forecasting parameters with an appropriate resolution in space and time. It follows that the forecasting system at the Centre must, from an operational point of view, be functionally integrated with the weather services of the Member Countries, and the operational interface between ECMWF and the Member Countries must be properly specified in order to give both systems reasonable flexibility. The problem of making numerical atmospheric predictions for periods beyond 4-5 days differs substantially from 2-3 day forecasting. From a physical point of view, we can define a medium-range forecast as one in which the initial disturbances have lost their individual structure. However, we are still interested in predicting the atmosphere in the same way as in short-range forecasting, which means that the model must be able to predict the dissipation and decay of the initial phenomena and the creation of new ones. With this definition, medium-range forecasting is indeed very difficult, and it is generally regarded as more difficult than extended forecasting, where usually only time and space mean values are predicted. The predictability of atmospheric flow has been studied extensively in recent years, in theoretical investigations and by numerical experiments. As discussed elsewhere in this publication (see pp 338 and 431), a 10-day forecast is apparently on the fringe of predictability.
Cross-layer design for MIMO systems over spatially correlated and keyhole Nakagami-m fading channels
Abstract:
Cross-layer design is a generic designation for a set of efficient adaptive transmission schemes, across multiple layers of the protocol stack, aimed at enhancing the spectral efficiency and increasing the transmission reliability of wireless communication systems. In this paper, one such cross-layer design scheme, combining physical-layer adaptive modulation and coding (AMC) with link-layer truncated automatic repeat request (T-ARQ), is proposed for multiple-input multiple-output (MIMO) systems employing orthogonal space-time block coding (OSTBC). The performance of the proposed cross-layer design is evaluated in terms of achievable average spectral efficiency (ASE), average packet loss rate (PLR) and outage probability, for which analytical expressions are derived, considering transmission over two types of MIMO fading channels, namely spatially correlated Nakagami-m fading channels and keyhole Nakagami-m fading channels. Furthermore, the effects of the maximum number of ARQ retransmissions, the numbers of transmit and receive antennas, the Nakagami fading parameter and the spatial correlation parameters are studied and discussed based on numerical results and comparisons. Copyright © 2009 John Wiley & Sons, Ltd.
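The link-layer half of the scheme has a simple closed form under an idealising independence assumption: with truncated ARQ, a packet is lost only if the original transmission and all N_max retransmissions fail. A minimal sketch follows; per-attempt error rates in a fading channel are in fact correlated, which the paper's analytical PLR expressions account for.

```python
def residual_plr(p_attempt, n_max):
    """Residual packet loss rate after truncated ARQ, assuming
    independent transmission attempts with error rate p_attempt."""
    return p_attempt ** (n_max + 1)

# A 10% per-attempt PER with up to 2 retransmissions leaves PLR = 1e-3,
# so AMC at the physical layer can target a looser per-attempt error
# rate and pick higher-rate modes, raising average spectral efficiency.
print(residual_plr(0.10, 2))
```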
Abstract:
In 2006 the UK government announced a move to zero carbon homes by 2016. The demand posed a major challenge to policy makers and construction professionals, entailing a protracted process of policy design. The task of giving content to this target is used to explore the role of evidence in the policy process. Whereas much literature on policy and evidence treats evidence as an external input, independent of politics, this paper explores the ongoing mutual constitution of both. Drawing on theories of policy framing and the sociology of classification, the account follows the story of a policy for Zero Carbon Homes from the parameters and values used to specify the target. Particular attention is given to the role of Regulatory Impact Assessments (RIAs) and to the creation of a new policy venue, the Zero Carbon Hub. The analysis underlines the way in which choices about how to model and measure the aims potentially transform them, the importance of policy venues for transparency, and the role of RIAs in the authorization of particular definitions. A more transparent, open approach to policy formulation is needed, in which the framing of evidence is recognized as an integral part of the policy process.