50 results for "Validation and certification competences process"


Relevance: 100.00%

Abstract:

Background: There is general agreement across all interested parties that a process of working together is the best way to determine which school or educational setting is right for an individual child with autism spectrum disorder. In the UK, families and local authorities both desire a constructive working relationship and see this as the best means by which to reach an agreement to determine where a child should be educated. It has been shown, however, that a constructive working relationship is not always achieved (Batten et al., Make schools make sense. Autism and education: the reality for families today; London: The National Autistic Society, 2006). Purpose: This small-scale study aims to explore the views of both parents and local authorities, focussing on how both parties perceive and experience the process of determining educational provision for children with autism spectrum disorders (ASD) within an English context. Sample, design and method: Parental opinion was gathered through a questionnaire with closed and open responses. The questionnaire was distributed to two national charities, two local charities and 16 specialist schools, which offered the questionnaire to parents of children with ASD, resulting in an opportunity sample of 738 returned surveys. The views of local authority personnel from five local authorities were gathered through semi-structured interviews. Data analyses included quantitative analysis of the closed-response questionnaire items, and theme-based qualitative analysis of the open responses and interviews with local authority personnel. Results: In the majority of cases, parents in the survey obtained their first-choice placement for their child. Despite this positive outcome, survey data indicated that parents found the process bureaucratic, stressful and time-consuming. Parents tended to perceive alternative placement suggestions as financially motivated rather than in the best interests of the child. Interviews with local authority personnel showed an awareness of these concerns and of the complex considerations involved in determining what is best for an individual child. Conclusions: This small-scale study highlights the need for more effective communication between parents of children with ASD and local authority personnel at all stages of the process.

Relevance: 100.00%

Abstract:

Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although some programmes are available for interpreting bacterial transcriptomics data and CGH microarray data on genetic stability in oncogenes, none is designed specifically for understanding the mosaic nature of bacterial genomes. Consequently a bottleneck persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from a test strain against three reference strains simultaneously. Each stage of the process is described. We have compared a number of methods for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, and have shown that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture-modelling technique. We have also shown that methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies of bacterial genomes.
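The dynamic kernel-density cut-off idea can be sketched in a few lines. The following is an illustrative Python example, not the authors' pipeline: the simulated log-ratio values, mode locations and search window are all invented, and SciPy's `gaussian_kde` stands in for whatever estimator the study used.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Simulated log2 hybridisation ratios (illustrative values only):
# "present" genes cluster near 0, "absent/divergent" genes near -2.
present = rng.normal(0.0, 0.3, size=900)
absent = rng.normal(-2.0, 0.4, size=100)
ratios = np.concatenate([present, absent])

# Kernel density estimate of the ratio distribution.
kde = gaussian_kde(ratios)
grid = np.linspace(ratios.min(), ratios.max(), 500)
density = kde(grid)

# Dynamic cut-off: the density minimum between the two modes separates
# "absent/divergent" from "present" calls for this array.
between = (grid > -2.0) & (grid < 0.0)
cutoff = grid[between][np.argmin(density[between])]

calls_present = ratios > cutoff
print(cutoff, calls_present.mean())
```

Because the cut-off is recomputed from each array's own density, it adapts to array-specific shifts, which is what makes the approach "dynamic" rather than a fixed threshold.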

Relevance: 100.00%

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we explore only the most accessible version, rejection-ABC. Rejection-ABC involves running the model a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than literature values. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm's movement and much of the energy budget. We show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs.
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
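Rejection-ABC as described above (draw from the prior, simulate, keep the runs closest to the observations) can be sketched in a few lines. This is a toy illustration, not the earthworm IBM: the one-parameter simulator, the uniform prior and the mean summary statistic are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Observed "data": summary statistic (mean) of samples from N(mu=3, sd=1).
observed = rng.normal(3.0, 1.0, size=200)
obs_summary = observed.mean()

def simulate(mu, rng, n=200):
    """Hypothetical simulator standing in for the IBM; returns one summary."""
    return rng.normal(mu, 1.0, size=n).mean()

# Rejection-ABC: draw parameters from the prior, simulate, and keep the
# draws whose simulated summary lies closest to the observed summary.
n_draws = 20000
prior_draws = rng.uniform(0.0, 10.0, size=n_draws)  # uniform prior on mu
distances = np.array([abs(simulate(mu, rng) - obs_summary)
                      for mu in prior_draws])

# Retain the 1% of simulations closest to the observations.
keep = distances <= np.quantile(distances, 0.01)
posterior = prior_draws[keep]

print(posterior.mean(), posterior.std())
```

The retained draws approximate the posterior: their spread gives the credible intervals referred to above, and a parameter whose retained values span the whole prior is one the data cannot narrow.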

Relevance: 100.00%

Abstract:

Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are measures linked to both the strategy and the business process. These measures are often designed for an industry sector under assumptions about the business processes in organizations. However, those assumptions can be too incomplete to guarantee the required properties of KPIs. This raises the need to validate the properties of KPIs prior to their application in performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) to the validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.

Relevance: 100.00%

Abstract:

Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Quantification of model prediction uncertainty is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework, is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total) observations, for a limited number of reaches, are used to assess the uncertainty and sensitivity of 44 model parameters identified as most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as fractional areas of land-use type) can achieve higher model fits than a previous expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash–Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets that simultaneously describe all observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there are insufficient data to support their many parameters.
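The GLUE procedure used above can be illustrated with a toy model. This is a sketch only: the two-parameter linear "model", priors and synthetic data stand in for INCA-P, while the 0.3 Nash–Sutcliffe threshold is borrowed from the paper as the behavioural cut-off.

```python
import numpy as np

rng = np.random.default_rng(0)

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency E: 1 is perfect, <= 0 no better than the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy "model": y = a * x + b (illustrative only), with synthetic observations.
x = np.linspace(0.0, 10.0, 50)
obs = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, size=x.size)

# GLUE step 1: Monte-Carlo sample the parameter space from broad priors.
n = 5000
a = rng.uniform(0.0, 5.0, n)
b = rng.uniform(-5.0, 5.0, n)
E = np.array([nash_sutcliffe(obs, ai * x + bi) for ai, bi in zip(a, b)])

# GLUE step 2: keep only "behavioural" parameter sets above the threshold
# (E > 0.3, as in the paper), weighting each by its likelihood measure.
behavioural = E > 0.3
w = E[behavioural] / E[behavioural].sum()

# GLUE step 3: likelihood-weighted predictions from the behavioural ensemble.
sims = a[behavioural, None] * x + b[behavioural, None]
pred_mean = (w[:, None] * sims).sum(axis=0)
print(behavioural.sum(), pred_mean[-1])
```

The spread of the behavioural ensemble, rather than a single best fit, is what GLUE reports as prediction uncertainty; an empty behavioural set is exactly the "no acceptable parameter sets" outcome the paper discusses.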

Relevance: 100.00%

Abstract:

Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damage and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs, to our knowledge for the first time, a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a Europe-wide damage function and an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte-Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses against a historical loss database. The climate models considered agree on an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark and northern Germany, and into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and southern Europe. The highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of eastern Europe. The resulting changes in Europe-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10-year loss), 50% (30-year loss), and 104% (100-year loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both the severity and the frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except Ireland (−22%) experience some loss increase. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impacts of climate change that are relevant for policy makers, scientists and economists.

Relevance: 100.00%

Abstract:

We give an overview of the development of "horizontal" European Committee for Standardisation (CEN) standards for characterising soils, sludges and biowaste in the context of environmental legislation in the European Union (EU). We discuss the various steps in the development of a horizontal standard (i.e. assessment of the possibility of such a standard, review of existing normative documents, pre-normative testing and validation) and related problems. We also provide a synopsis of European and international standards covered by the so-called Project HORIZONTAL.

Relevance: 100.00%

Abstract:

This paper investigates and evaluates the process of knowledge transfer in construction projects. Owing to the highly competitive nature of business environments, knowledge transfer between organisations has become increasingly common in recent years. However, although organisations can realise remarkable benefits by transferring knowledge from one unit to another, successful knowledge transfer can be difficult to achieve. The discussion presented in the paper is mainly based on the findings of two case studies, selected from Private Finance Initiative (PFI) projects in the UK. According to the case study findings, the different stages of a knowledge transfer process can overlap, be omitted or repeated, or be interrupted and then restarted. One of the significant findings of the case studies was the role of the "knowledge mediator". In the selected case studies, external consultants and expert staff acted as knowledge mediators, and the importance of their role was frequently highlighted by the interview participants. They not only facilitated close liaison between the knowledge source and the receiver; their role was also strongly associated with practices of translation and interpretation. This combined role of mediator/translator therefore appears to be particularly significant for inter-organisational knowledge transfer in PFI projects.

Relevance: 100.00%

Abstract:

This paper proposes a new iterative algorithm for OFDM joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Although addressing it is essential for this joint approach, the overfitting problem has received relatively little study in existing algorithms. In this paper, specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the phase noise, and finally a more robust and compact fast process based on Givens rotation is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm.

Relevance: 100.00%

Abstract:

This paper proposes a new iterative algorithm for orthogonal frequency division multiplexing (OFDM) joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the relatively less studied problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the PHN, and finally a more robust and compact fast process based on Givens rotation is proposed to reduce the complexity to a practical level. Numerical simulations are also given to verify the proposed algorithm.
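A Givens rotation zeroes one matrix entry at a time with a 2×2 plane rotation, which is what makes it attractive for compact, numerically robust fast processes like the one mentioned above. The sketch below shows the standard technique in a generic QR-factorisation setting; it is illustrative only and is not the paper's receiver implementation.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

def qr_givens(A):
    """QR factorisation by zeroing sub-diagonal entries one rotation at a time."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for j in range(n):                    # for each column ...
        for i in range(m - 1, j, -1):     # ... zero entries below the diagonal
            c, s = givens(R[i - 1, j], R[i, j])
            G = np.eye(m)
            # Embed the 2x2 rotation acting on rows i-1 and i.
            G[[i - 1, i - 1, i, i], [i - 1, i, i - 1, i]] = [c, s, -s, c]
            R = G @ R                     # rotation annihilates R[i, j]
            Q = Q @ G.T                   # accumulate the orthogonal factor
    return Q, R

A = np.array([[4.0, 1.0], [2.0, 3.0], [1.0, 2.0]])
Q, R = qr_givens(A)
print(np.allclose(Q @ R, A))
```

Because each rotation touches only two rows, updates can be applied incrementally as new data arrive, which is the usual reason Givens rotations reduce complexity in iterative receivers.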

Relevance: 100.00%

Abstract:

A new parameter-estimation algorithm, which minimises the cross-validated prediction error for linear-in-the-parameter models, is proposed, based on stacked regression and an evolutionary algorithm. It is initially shown that cross-validation is very important for prediction in linear-in-the-parameter models using a criterion called the mean dispersion error (MDE). Stacked regression, which can be regarded as a sophisticated type of cross-validation, is then introduced based on an evolutionary algorithm, to produce a new parameter-estimation algorithm, which preserves the parsimony of a concise model structure that is determined using the forward orthogonal least-squares (OLS) algorithm. The PRESS prediction errors are used for cross-validation, and the sunspot and Canadian lynx time series are used to demonstrate the new algorithms.
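For linear-in-the-parameters models, the PRESS (leave-one-out) prediction errors mentioned above need no model refitting: they follow in closed form from the ordinary residuals and the hat-matrix diagonal as e_i / (1 - h_ii). A minimal sketch with invented data (not the sunspot or lynx series):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-in-the-parameters model: y = X @ theta + noise.
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
theta_true = np.array([0.5, 2.0, -1.0])
y = X @ theta_true + rng.normal(0.0, 0.3, size=n)

# Ordinary least-squares fit and hat matrix H = X (X'X)^-1 X'.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
H = X @ np.linalg.solve(X.T @ X, X.T)
resid = y - X @ theta

# PRESS: leave-one-out residuals in closed form, e_i / (1 - h_ii),
# so no refitting of the model is required.
loo_resid = resid / (1.0 - np.diag(H))
press = np.sum(loo_resid ** 2)

# Sanity check: PRESS always exceeds the in-sample residual sum of squares.
rss = np.sum(resid ** 2)
print(press, rss)
```

Minimising PRESS rather than the in-sample residual sum of squares is what penalises overfitted model structures, which is the role it plays in the cross-validated criterion described above.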

Relevance: 100.00%

Abstract:

During April–May 2010, volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe, causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for the ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracy of current state-of-the-art models are of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers, other remote-sensing instruments and in-situ collectors. From the air, sondes and multiple aircraft missions also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground-based and airplane-based measurements obtained during the initial 14–23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution, using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft. We find good quantitative agreement, with an error similar to the spread in the observations (depending, however, on the method used to estimate the mass eruption rate) for both airborne and ground mass concentrations. Such verification results help us understand and constrain the accuracy and reliability of ash transport models and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.

Relevance: 100.00%

Abstract:

In 2007 a General Observation Period (GOP) was performed within the German Priority Program on Quantitative Precipitation Forecasting (PQP). By optimizing the use of existing instrumentation, a large data set from in-situ and remote-sensing instruments, with special focus on water cycle variables, was gathered over the full annual cycle. The area of interest covered central Europe, with increasing focus towards the Black Forest, where the Convective and Orographically-induced Precipitation Study (COPS) took place from June to August 2007. The GOP thus includes a variety of precipitation systems, relating the COPS results to a larger spatial scale. For timely use of the data, forecasts of the numerical weather prediction models COSMO-EU and COSMO-DE of the German Meteorological Service were tailored to match the observations, enabling model evaluation in a near real-time environment. The ultimate goal is to identify and distinguish between different kinds of model deficits and to improve process understanding.

Relevance: 100.00%

Abstract:

The past decade has witnessed a sharp increase in published research on energy and buildings. This paper takes stock of work in this area, with a particular focus on construction research and the analysis of non-technical dimensions. While there is widespread recognition as to the importance of non-technical dimensions, research tends to be limited to individualistic studies of occupants and occupant behavior. In contrast, publications in the mainstream social science literature display a broader range of interests, including policy developments, structural constraints on the diffusion and use of new technologies and the construction process itself. The growing interest of more generalist scholars in energy and buildings provides an opportunity for construction research to engage a wider audience. This would enrich the current research agenda, helping to address unanswered problems concerning the relatively weak impact of policy mechanisms and new technologies and the seeming recalcitrance of occupants. It would also help to promote the academic status of construction research as a field. This, in turn, depends on greater engagement with interpretivist types of analysis and theory building, thereby challenging deeply ingrained views on the nature and role of academic research in construction.

Relevance: 100.00%

Abstract:

Purpose – This paper examines the role of location-specific (L) advantages in the spatial distribution of multinational enterprise (MNE) R&D activity. The meaning of L advantages is revisited. In addition to L advantages that are industry-specific, the paper emphasises that there is an important category of L advantages, referred to as collocation advantages. Design/methodology/approach – Using the OLI framework, this paper highlights that the innovation activities of MNEs are about the interaction of these variables, and the essential process of internalising L advantages to enhance and create firm-specific advantages. Findings – Collocation advantages derive from spatial proximity to specific unaffiliated firms, which may be suppliers, competitors, or customers. It is also argued that L advantages are not always public goods, because they may not be available to all firms at a similar or marginal cost. These costs are associated with access to and internalisation of L advantages and, especially in the case of R&D, are compounded by the complexities of embeddedness. Originality/value – The centralisation/decentralisation and spatial separation/collocation debates in R&D location have been mistakenly viewed as a paradox facing firms, instead of as a trade-off that firms must make.