71 results for model-based security management


Relevance:

100.00%

Publisher:

Abstract:

The International Nusantara Stratification and Transport (INSTANT) program measured currents through multiple Indonesian Seas passages simultaneously over a three-year period (January 2004 to December 2006). The Indonesian Seas region has presented numerous challenges for numerical modelers: the Indonesian Throughflow (ITF) must pass over shallow sills, into deep basins, and through narrow constrictions on its way from the Pacific to the Indian Ocean. Because the region is an important piece of the global climate puzzle, a number of models have been used to try to simulate this throughflow as accurately as possible. In an attempt to validate our model, we present a comparison between the transports calculated from our model and those calculated from the INSTANT in situ measurements at five passages within the Indonesian Seas (Labani Channel, Lifamatola Passage, Lombok Strait, Ombai Strait, and Timor Passage). Our Princeton Ocean Model (POM) based regional Indonesian Seas model was originally developed to analyze the influence of bottom topography on the temperature and salinity distributions in the region, to trace the path of the South Pacific Water from the continuation of the New Guinea Coastal Current entering the region of interest up to the Lifamatola Passage, and to assess the role of the pressure head in driving the ITF and in determining its total transport. Previous studies found that this model reasonably represents the general long-term (seasonal) flow through this region. The INSTANT transports were compared to the results of this regional model over multiple timescales. Overall trends are somewhat represented, but changes on timescales shorter than seasonal (three months) and longer than annual were not considered in our model. Normal velocities through each passage during every season are plotted. Daily volume transports and transport-weighted temperature and salinity are plotted, and seasonal averages are tabulated.
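The transport diagnostics described above reduce to area-weighted sums over each passage cross-section. A minimal sketch with made-up grid values; the sign convention (negative for Pacific-to-Indian flow) is an assumption for illustration:

```python
import numpy as np

def passage_transport(v, area, temp, salt):
    """Volume transport (Sv) and transport-weighted T/S for one passage.

    v    : normal velocity (m/s) per grid cell of the passage cross-section
    area : cell area (m^2)
    temp, salt : cell temperature (degC) and salinity (psu)
    """
    q = v * area                        # per-cell volume flux (m^3/s)
    total = q.sum()
    tw_temp = (q * temp).sum() / total  # transport-weighted temperature
    tw_salt = (q * salt).sum() / total  # transport-weighted salinity
    return total / 1e6, tw_temp, tw_salt  # transport in Sverdrups

# toy two-cell cross-section (negative: Pacific -> Indian Ocean)
sv, tw_t, tw_s = passage_transport(
    np.array([-0.5, -1.0]),
    np.array([2e6, 2e6]),
    np.array([20.0, 10.0]),
    np.array([34.5, 34.7]))
```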

Relevance:

100.00%

Publisher:

Abstract:

1. Quantitative reconstruction of past vegetation distribution and abundance from sedimentary pollen records provides an important baseline for understanding long-term ecosystem dynamics and for the calibration of earth system process models, such as the regional-scale climate models widely used to predict future environmental change. Most current approaches assume that the amount of pollen produced by each vegetation type, usually expressed as a relative pollen productivity term, is constant in space and time.
2. Estimates of relative pollen productivity can be extracted from extended R-value analysis (Parsons and Prentice, 1981) using comparisons between pollen assemblages deposited in sedimentary contexts, such as moss polsters, and measurements of the present-day vegetation cover around the sampled location. The vegetation survey method has been shown to have a profound effect on estimates of model parameters (Bunting and Hjelle, 2010); a standard method is therefore an essential prerequisite for testing some of the key assumptions of pollen-based reconstruction of past vegetation, such as the assumption that relative pollen productivity is effectively constant in space and time within a region or biome.
3. This paper systematically reviews the assumptions and methodology underlying current models of pollen dispersal and deposition, and thereby identifies the key characteristics of an effective vegetation survey method for estimating relative pollen productivity in a range of landscape contexts.
4. It then presents the methodology used in a current research project, developed during a practitioner workshop. The method selected is pragmatic: it is designed to be replicable by different research groups, to be usable in a wide range of habitats, and to require minimum effort to collect adequate data for model calibration, rather than representing some ideal or required approach. Using this common methodology will allow project members to collect multiple measurements of relative pollen productivity for the major plant taxa at several northern European locations in order to test the assumption that these values are uniform within the climatic range of the main taxa recorded in pollen records from the region.
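The extended R-value approach compares pollen proportions with distance-weighted vegetation abundance around each site. The sketch below is purely illustrative: it uses a simplified linear model (pollen = productivity × distance-weighted abundance + background) with synthetic numbers, not the project's actual survey protocol or ERV likelihood:

```python
import numpy as np

def estimate_rpp(pollen, dwpa):
    """Least-squares fit of pollen = alpha * dwpa + z0.

    pollen : pollen proportion of one taxon at each sample site
    dwpa   : distance-weighted plant abundance around each site
    Returns the relative pollen productivity alpha and background term z0.
    """
    A = np.column_stack([dwpa, np.ones_like(dwpa)])
    (alpha, z0), *_ = np.linalg.lstsq(A, pollen, rcond=None)
    return alpha, z0

# synthetic data with known productivity 2.0 and background 0.05
dwpa = np.array([0.1, 0.2, 0.4, 0.8])
pollen = 2.0 * dwpa + 0.05
alpha, z0 = estimate_rpp(pollen, dwpa)
```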

Relevance:

100.00%

Publisher:

Abstract:

As a comparatively new PKM with over-constraints in its kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its well-understood kinematics, the stiffness characteristics of the Exechon remain a challenge to analyse because of the structural complexity. To achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems, and a fixed base subsystem, connected to each other sequentially through the corresponding joints. Each limb body is modeled as a spatial beam with a uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 × 6 block matrix from the inversion of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, as well as throughout the workspace, is computed quickly with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position dependency of the PKM's stiffness: it is symmetric relative to a work plane owing to the structural features. Finally, the effects of design variables such as structural, dimensional, and stiffness parameters on system rigidity are investigated, with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM.
It is worth mentioning that the proposed stiffness-modeling methodology can also be applied to other overconstrained PKMs and, with minor revisions, can efficiently evaluate the global rigidity over the workspace.
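The final extraction step can be shown in a few lines: invert the governing compliance matrix and take the 6 × 6 block associated with the moving-platform degrees of freedom. The matrices below are toy stand-ins, not the assembled Exechon model:

```python
import numpy as np

def platform_stiffness(C_gov, platform_dofs):
    """Extract the moving-platform stiffness from the governing compliance.

    C_gov : assembled governing compliance matrix of the whole system
    platform_dofs : indices of the six moving-platform degrees of freedom
    """
    S = np.linalg.inv(C_gov)                        # invert governing compliance
    return S[np.ix_(platform_dofs, platform_dofs)]  # 6 x 6 platform block

# toy 8-DOF system: diagonal compliance, platform DOFs listed first
C_gov = np.diag(np.arange(1.0, 9.0))
K66 = platform_stiffness(C_gov, list(range(6)))     # shape (6, 6)
```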

Relevance:

100.00%

Publisher:

Abstract:

Background: Medical Research Council (MRC) guidelines recommend applying theory within complex interventions to explain how behaviour change occurs. Guidelines endorse self-management of chronic low back pain (CLBP) and osteoarthritis (OA), but evidence for its effectiveness is weak. Objective: This literature review aimed to determine the use of behaviour change theory and techniques within randomised controlled trials of group-based self-management programmes for chronic musculoskeletal pain, specifically CLBP and OA. Methods: A two-phase search strategy of electronic databases was used to identify systematic reviews and studies relevant to this area. Articles were coded for their use of behaviour change theory, and the number of behaviour change techniques (BCTs) was identified using the 93-item BCT Taxonomy (v1). Results: 25 articles reporting 22 studies met the inclusion criteria, of which only three reported having based their intervention on theory, all using Social Cognitive Theory. A total of 33 BCTs were coded across all articles, with the most commonly identified techniques being 'instruction on how to perform the behaviour', 'demonstration of the behaviour', 'behavioural practice', 'credible source', 'graded tasks' and 'body changes'. Conclusion: The results demonstrate that theoretically driven research within group-based self-management programmes for chronic musculoskeletal pain is lacking, or is poorly reported. Future research that follows the recommended guidelines regarding the use of theory in study design and reporting is warranted.
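The coding procedure amounts to tallying which taxonomy items are identified in each article. A toy illustration with made-up article codings, not the review's actual data:

```python
from collections import Counter

# Hypothetical codings for three articles against taxonomy items;
# the real review coded 25 articles with a 93-item taxonomy.
codings = {
    "article_1": {"instruction on how to perform the behaviour", "graded tasks"},
    "article_2": {"demonstration of the behaviour", "graded tasks"},
    "article_3": {"instruction on how to perform the behaviour", "credible source"},
}

# tally how many articles each behaviour change technique appears in
bct_counts = Counter(bct for bcts in codings.values() for bct in bcts)
n_distinct = len(bct_counts)  # number of distinct BCTs coded
```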

Relevance:

100.00%

Publisher:

Abstract:

We consider the problem of regulating the rate of harvesting a natural resource, taking account of the wider system represented by a set of ecological and economic indicators, given differing stakeholder priorities. This requires objective and transparent decision making to show how indicators impinge on the resulting regulation decision. We offer a new scheme for combining indicators, derived from assessing the suitability of lowering versus not lowering the harvest rate based on indicator values relative to their predefined reference levels. Using the practical example of fisheries management under an “ecosystem approach,” we demonstrate how different stakeholder views can be quantitatively represented by weighting sets applied to these comparisons. Using the scheme in an analysis of historical data from the Celtic Sea fisheries, we find great scope for negotiating agreement among disparate stakeholders.
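The combination scheme can be sketched as a weighted vote of indicator-versus-reference comparisons. Everything below is a hypothetical illustration: the ±1 scoring rule, the indicators, and the stakeholder weight sets are assumptions, not the paper's calibrated values:

```python
import numpy as np

def lower_harvest_score(values, references, weights):
    """Weighted vote for lowering the harvest rate.

    An indicator at or above its reference level votes +1 (favouring a
    lower harvest rate); below it, -1. A positive total favours lowering.
    """
    votes = np.where(values < references, -1.0, 1.0)
    return float(np.dot(weights, votes))

# three indicators (e.g. fishing effort, biomass pressure, bycatch index)
values     = np.array([0.8, 1.2, 1.5])
references = np.array([1.0, 1.0, 1.0])

eco_weights  = np.array([0.2, 0.3, 0.5])  # ecology-focused stakeholder
econ_weights = np.array([0.6, 0.2, 0.2])  # economics-focused stakeholder
eco_score  = lower_harvest_score(values, references, eco_weights)
econ_score = lower_harvest_score(values, references, econ_weights)
```

With these numbers the ecology-focused weighting favours lowering the harvest rate while the economics-focused weighting does not, which is exactly the kind of stakeholder disagreement the weighting sets are designed to make explicit and negotiable.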

Relevance:

100.00%

Publisher:

Abstract:

A rich-model-based motion vector steganalysis method benefiting from both the temporal and spatial correlations of motion vectors is proposed in this work. The proposed steganalysis method achieves substantially higher detection accuracy than previous methods, even the targeted ones. The improvement in detection accuracy stems from several novel approaches introduced in this work. Firstly, it is shown that there is a strong correlation among neighbouring motion vectors over longer distances, not only spatially but also temporally. Therefore, the temporal motion vector dependency, alongside the spatial dependency, is utilized for rigorous motion vector steganalysis. Secondly, unlike the filters previously used, which were heuristically designed against a specific motion vector steganography method, a diverse set of filters that can capture the aberrations introduced by various motion vector steganography methods is used. The variety and number of the filter kernels are substantially greater than in previous work. In addition, filters up to fifth order are employed, whereas previous methods used at most second-order filters. As a result, the proposed system captures various decorrelations over a wide spatio-temporal range and provides a better cover model. The proposed method is tested against the most prominent motion vector steganalysis and steganography methods. To the best of the authors' knowledge, the experiments section contains the most comprehensive tests in the motion vector steganalysis field, including five stego and seven steganalysis methods. Test results show that the proposed method yields around a 20% increase in detection accuracy for low payloads and 5% for higher payloads.
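The residual idea can be sketched by applying difference filters of increasing order along the temporal and the two spatial axes of a motion-vector field. This is a simplified stand-in with toy statistics, not the full rich-model feature set:

```python
import numpy as np

def mv_residual_features(mv, max_order=5):
    """Residual statistics from difference filters over a motion-vector field.

    mv : one motion-vector component with shape (frames, height, width)
    """
    feats = []
    for axis in (0, 1, 2):                       # temporal, vertical, horizontal
        for order in range(1, max_order + 1):    # first- up to fifth-order filters
            r = np.diff(mv, n=order, axis=axis)  # n-th order difference residual
            feats.extend([r.mean(), r.var()])    # toy statistics per residual
    return np.array(feats)

mv = np.random.default_rng(0).normal(size=(8, 16, 16))
features = mv_residual_features(mv)              # 3 axes x 5 orders x 2 stats
```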

Relevance:

100.00%

Publisher:

Abstract:

Aim. The purpose of this study was to develop and evaluate a computer-based dietary and physical activity self-management program for people recently diagnosed with type 2 diabetes.
Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)) measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups. Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control; mean age 59 (SD) years). After completion, there was a significant between-group difference in the "knowledge and beliefs" scale of the DOQ. Two-thirds of the intervention group rated the program as either good or very good, 92% would recommend the program to others, and 96% agreed that the information within the program was clear and easy to understand.
Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management.

Relevance:

100.00%

Publisher:

Abstract:

Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of design variables [1]. The approach in this paper links the adjoint surface sensitivities (gradient of the objective function with respect to the surface movement) with the parametric design velocities (movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to the CAD variables.
For a successful implementation of shape optimisation strategies in practical industrial cases, the choice of design variables or the parameterisation scheme used for the model to be optimised plays a vital role. Where the goal is to base the optimisation on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history, which preserves the design intent [3]. The main advantage of using the feature-based model is that the optimised model produced can be directly used for downstream applications, including manufacturing and process planning.
This paper presents an approach for optimisation based on the feature-based CAD model, which uses the CAD parameters defining the features in the model geometry as the design variables. In order to capture the CAD surface movement with respect to a change in a design variable, the "Parametric Design Velocity" is calculated, defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating the design velocities represents an advancement in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted by other CAD-based optimisation processes. It considers every continuous ("real-valued") parameter type as an optimisation variable, and it can be adapted to work with any CAD modelling software, as long as it has an API which provides access to the values of the parameters which control the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD model before and after the parameter perturbation. The implementation involves calculating the geometric movement along the normal direction between two discrete representations of the original and perturbed geometries. Parametric design velocities can then be directly linked with the adjoint surface sensitivities to extract the gradients used in a gradient-based optimisation algorithm.
A flow optimisation problem is presented in which the power dissipation of the flow in an automotive air duct is reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost-function gradients. A line-search algorithm is then used to update the design variables and proceed with the optimisation process.
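The gradient assembly described above chains finite-difference design velocities with adjoint surface sensitivities. A minimal sketch, assuming matched surface point clouds with known normals (the projection step between non-matching discretisations of the original and perturbed geometries is omitted):

```python
import numpy as np

def design_velocity(x0, x1, normals, dp):
    """Normal boundary movement per unit change of a CAD parameter.

    x0, x1  : (n, 3) surface points before and after the perturbation
    normals : (n, 3) outward unit normals at the points
    dp      : size of the CAD parameter perturbation
    """
    return np.einsum('ij,ij->i', x1 - x0, normals) / dp

def parameter_gradient(sens_n, vel_n, face_areas):
    """dJ/dp as the surface integral of (adjoint sensitivity * design velocity)."""
    return float(np.sum(sens_n * vel_n * face_areas))

# toy surface of 3 points, each moved 0.01 along its normal by dp = 0.1
x0 = np.zeros((3, 3))
n  = np.tile([0.0, 0.0, 1.0], (3, 1))
x1 = x0 + 0.01 * n
vn = design_velocity(x0, x1, n, dp=0.1)
dJdp = parameter_gradient(np.array([2.0, 2.0, 2.0]), vn, np.ones(3))
```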

Relevance:

100.00%

Publisher:

Abstract:

A constrained non-linear, physical-model-based predictive control (NPMPC) strategy is developed for improved plant-wide control of a thermal power plant. The strategy makes use of successive linearisation and recursive state estimation using extended Kalman filtering to obtain a linear state-space model. The linear model and a quadratic programming routine are used to design a constrained long-range predictive controller. One special feature is the careful selection of a specific set of plant model parameters for online estimation, to account for time-varying system characteristics resulting from major system disturbances and ageing. These parameters act as nonstationary stochastic states and help to provide sufficient degrees of freedom to obtain unbiased estimates of the controlled outputs. A 14th-order non-linear plant model, simulating the dominant characteristics of a 200 MW oil-fired power plant, has been used to test the NPMPC algorithm. The control strategy gives impressive simulation results during large system disturbances and extremely high rates of load change, right across the operating range. These results compare favourably to those obtained with the state-space GPC method designed under similar conditions.
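The successive-linearisation step at the heart of the strategy can be sketched by finite-differencing a non-linear state-update model to obtain the state-space matrices used by the predictive controller at each sample. The two-state model below is a hypothetical stand-in for the 14th-order plant model:

```python
import numpy as np

def linearise(f, x, u, eps=1e-6):
    """Finite-difference Jacobians A = df/dx, B = df/du at (x, u)."""
    nx, nu = len(x), len(u)
    A = np.zeros((nx, nx))
    B = np.zeros((nx, nu))
    f0 = f(x, u)
    for i in range(nx):
        dx = np.zeros(nx); dx[i] = eps
        A[:, i] = (f(x + dx, u) - f0) / eps
    for j in range(nu):
        du = np.zeros(nu); du[j] = eps
        B[:, j] = (f(x, u + du) - f0) / eps
    return A, B

# stand-in non-linear model: x+ = [x0 + 0.1*x1, 0.9*x1 + x0*u0]
f = lambda x, u: np.array([x[0] + 0.1 * x[1], 0.9 * x[1] + x[0] * u[0]])
A, B = linearise(f, np.array([1.0, 0.0]), np.array([0.5]))
```

The resulting (A, B) pair is what a long-range predictive controller with a quadratic programming routine would consume at each sampling instant.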

Relevance:

100.00%

Publisher:

Abstract:

Artificial neural networks (ANNs) can be easily applied to short-term load forecasting (STLF) models for electric power distribution applications. However, they are not typically used in medium- and long-term load forecasting (MLTLF) models for electric power because of the difficulties associated with collecting and processing the necessary data. Virtual instrument (VI) techniques can be applied to electric power load forecasting, but this is rarely reported in the literature. In this paper, we investigate the modelling and design of a VI for short-, medium- and long-term load forecasting using ANNs. Three ANN models were built for STLF of electric power. These networks were trained using historical load data and also weather data, which is known to have a significant effect on the use of electric power (such as wind speed, precipitation, atmospheric pressure, temperature and humidity). To do this, a V-shape temperature processing model is proposed. With regard to MLTLF, a model was developed using radial basis function neural networks (RBFNNs). Results indicate that the forecasting model based on the RBFNN has high accuracy and stability. Finally, a virtual load forecaster which integrates the VI and the RBFNN is presented.
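V-shape temperature processing can be sketched as mapping raw temperature to its distance from a comfort set-point, so that both heating loads (cold days) and cooling loads (hot days) grow with the transformed input fed to the ANN. The 18 °C set-point is an assumption for illustration, not a value taken from the paper:

```python
def v_shape(temp_c, comfort_c=18.0):
    """Map raw temperature to its distance from a comfort set-point (degC)."""
    return abs(temp_c - comfort_c)

# cold, comfortable and hot days all map onto one load-relevant scale
inputs = [v_shape(t) for t in (-5.0, 18.0, 30.0)]
```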

Relevance:

100.00%

Publisher:

Abstract:

It is convenient and effective to solve nonlinear problems with a model that has a linear-in-the-parameters (LITP) structure. However, the nonlinear parameters (e.g. the width of a Gaussian function) of each model term need to be pre-determined, either from expert experience or through exhaustive search. An alternative approach is to optimize them by a gradient-based technique (e.g. Newton's method). Unfortunately, all of these methods still require a large amount of computation. Recently, the extreme learning machine (ELM) has shown its advantages in terms of fast learning from data, but the sparsity of the constructed model cannot be guaranteed. This paper proposes a novel algorithm for the automatic construction of a nonlinear system model based on the extreme learning machine. This is achieved by effectively integrating the ELM and leave-one-out (LOO) cross-validation with our two-stage stepwise construction procedure [1]. The main objective is to improve the compactness and generalization capability of the model constructed by the ELM method. Numerical analysis shows that the proposed algorithm involves only about half the computation of the orthogonal least squares (OLS) based method. Simulation examples are included to confirm the efficacy and superiority of the proposed technique.
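A bare-bones ELM illustrates why training is fast: only the linear output weights are fitted, by least squares, while the hidden layer stays random. This sketch omits the paper's two-stage stepwise construction and LOO-based model selection:

```python
import numpy as np

def elm_fit(X, y, n_hidden=20, seed=1):
    """Fit a basic ELM: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # linear output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# fit a smooth 1-D target function
X = np.linspace(-1, 1, 50).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
W, b, beta = elm_fit(X, y)
err = np.max(np.abs(elm_predict(X, W, b, beta) - y))  # small training error
```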