11 results for evaluation methods

in Dalarna University College Electronic Archive


Relevance:

60.00%

Publisher:

Abstract:

Most science centres in Canada employ science-educated floor staff to motivate visitors to have fun while enhancing the educational reach of the exhibits. Although bright and sensitive to visitors’ needs, floor staff are rarely consulted in the planning, implementation, and modification phases of an exhibit. Instead, many development teams rely on costly third-party evaluations or skip the front-end and formative evaluations altogether, leading to costly errors that could have been avoided. This study will seek to reveal a correlation between floor staff’s perception of visitors’ interactions with an exhibit and visitors’ actual experiences. If a correlation exists, a recommendation could be made to encourage planning teams to include floor staff in the formative and summative evaluations of an exhibit. This is especially relevant to science centres with limited budgets and for whom a divide exists between floor staff and management. In this study, a formative evaluation of one exhibit was conducted, measuring both floor staff’s perceptions of the visitor experience and visitors’ own perceptions of the exhibit. Floor staff were then trained on visitor evaluation methods. A week later, floor staff and visitors were surveyed a second time on a different exhibit to determine whether an increase in accuracy existed. The training session increased the specificity of the motivation and comprehension responses and the enthusiasm of the staff, but not their ability to predict observed behaviours with respect to ergonomics, learning indicators, holding power, and success rates. The results revealed that although floor staff underestimated visitors’ success rates at the exhibits, staff accurately predicted visitors’ behaviours with respect to holding power, ergonomics, learning indicators, motivation, and comprehension, both before and after the staff training.

Relevance:

40.00%

Publisher:

Abstract:

This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random-effects models. In variance-component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed. It was found to correctly adjust for the bias in genetic variance component estimation and to gain power in QTL mapping in terms of precision. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data covering an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance instead of the mean, which validated the idea of variance-controlling genes. The work in this thesis is accompanied by R packages available online, including a general statistical tool for fitting random-effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
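As a rough illustration of the whole-genome shrinkage estimation behind a package like bigRR, here is a minimal ridge-regression sketch on simulated marker data (the sizes, penalty, and effect values are all invented, and this is plain ridge, not the generalized version):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 100                      # individuals, markers (p >> n)
X = rng.standard_normal((n, p))     # standardized marker genotypes
beta_true = np.zeros(p)
beta_true[:5] = 1.0                 # a few markers with real effects
y = X @ beta_true + 0.5 * rng.standard_normal(n)

lam = 10.0                          # ridge penalty (hypothetical choice)
# Closed-form ridge solution: (X'X + lam*I)^-1 X'y shrinks all effects
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

The shrinkage makes the problem well-posed even though there are far more markers than individuals.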

Relevance:

30.00%

Publisher:

Abstract:

If a plastic material is used as a print bearer, a special surface treatment is needed to obtain good and durable printing. The most widely used surface treatment technique at the moment is corona treatment. Unfortunately, this kind of treatment has been shown not to be very durable in the long term. In this project, plasma treatment, which in this case uses different kinds of gases in the treatment of polypropylene, is shown to be a more effective treatment. When the plasma-treated surface has been printed, the good quality lasts much longer and the adhesion between the ink and the surface is retained. At the moment, a standard (ASTM D3359) is used to test this adhesion. This standard has proved unstable and dependent on many different factors, which gives a large variation in the test results. Because of this, new test methods have been developed to give a more even and more reliable result when testing the adhesion.

Relevance:

30.00%

Publisher:

Abstract:

IPTV is now offered by several operators in Europe, the US, and Asia using broadcast video over private IP networks that are isolated from the Internet. IPTV services rely on transmission of live (real-time) video and/or stored video. Video on Demand (VoD) and Time-shifted TV are implemented by IP unicast, while Broadcast TV (BTV) and Near Video on Demand are implemented by IP multicast. IPTV services require QoS guarantees and can tolerate no more than 10⁻⁶ packet loss probability, 200 ms delay, and 50 ms jitter. Low delay is essential for satisfactory trick-mode performance (pause, resume, fast forward) for VoD, and for fast channel change time for BTV. Internet Traffic Engineering (TE) is defined in RFC 3272 and involves both capacity management and traffic management. Capacity management includes capacity planning, routing control, and resource management. Traffic management includes (1) nodal traffic control functions such as traffic conditioning, queue management, and scheduling, and (2) other functions that regulate traffic flow through the network or that arbitrate access to network resources. An IPTV network architecture includes multiple networks (core network, metro network, access network, and home network) that connect devices (super head-end, video hub office, video serving office, home gateway, set-top box). Each IP router in the core and metro networks implements some queueing and packet scheduling mechanism at the output link controller. Popular schedulers in IP networks include Priority Queueing (PQ), Class-Based Weighted Fair Queueing (CBWFQ), and Low Latency Queueing (LLQ), which combines PQ and CBWFQ. The thesis analyzes several packet scheduling algorithms that can optimize the trade-off between system capacity and end-user performance for the traffic classes. Previously, the FIFO, PQ, and GPS queueing methods had been implemented in the simulator. This thesis aims to implement the LLQ scheduler in the simulator and to evaluate the performance of these packet schedulers.
The simulator was provided by Ernst Nordström; it was built in a Visual C++ 2008 environment and tested and analyzed in MatLab 7.0 under Windows Vista.
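A toy sketch of the LLQ idea, strict priority for real-time traffic on top of weighted service for the remaining classes, might look as follows; the class names and weights are illustrative, and the credit scheme is a simplification of real CBWFQ:

```python
from collections import deque

priority_q = deque()                        # real-time class, strict priority (PQ part)
class_qs = {"vod": deque(), "data": deque()}
weights = {"vod": 3, "data": 1}             # CBWFQ-style service weights
credits = dict(weights)

def enqueue(cls, pkt):
    # "rt" goes to the strict-priority queue; other classes to their own queue
    (priority_q if cls == "rt" else class_qs[cls]).append(pkt)

def dequeue():
    if priority_q:                          # PQ part: real-time always served first
        return priority_q.popleft()
    for cls, q in class_qs.items():         # weighted part: spend service credits
        if q and credits[cls] > 0:
            credits[cls] -= 1
            return q.popleft()
    for cls in weights:                     # all credits spent: replenish them
        credits[cls] = weights[cls]
    return next((q.popleft() for q in class_qs.values() if q), None)
```

Real-time packets thus see minimal queueing delay, while the other classes share the remaining capacity in proportion to their weights.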

Relevance:

30.00%

Publisher:

Abstract:

As a first step in assessing the potential of thermal energy storage in Swedish buildings, the current situation of the Swedish building stock and different storage methods are discussed in this paper. Overall, many buildings date from the 1960s or earlier and have a relatively high energy demand, creating opportunities for large energy savings. The major means of heating are electricity for detached houses and district heating for multi-dwelling houses and premises. Cooling needs are relatively low but steadily increasing, emphasizing the need to consider energy storage for both heat and cold. The thermal mass of a building is important for passive storage of thermal energy, but this has not been considered much when constructing buildings in Sweden. Instead, the common ways of storing thermal energy in Swedish buildings today are in water storage tanks or in the ground using boreholes, while latent thermal energy storage is still very uncommon.
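A back-of-the-envelope example of sensible-heat storage in a water tank, the most common storage form mentioned above; the tank size and temperature lift are illustrative assumptions, not figures from the paper:

```python
# Sensible heat stored in a water tank: Q = m * c * dT
volume_l = 500                  # tank volume in litres (hypothetical)
mass_kg = volume_l * 1.0        # ~1 kg of water per litre
c_water = 4186                  # specific heat of water, J/(kg*K)
delta_t = 40                    # e.g. charging from 30 °C to 70 °C

q_joules = mass_kg * c_water * delta_t
q_kwh = q_joules / 3.6e6        # convert J to kWh
```

A 500-litre tank with a 40 K lift thus stores roughly 23 kWh of heat.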

Relevance:

30.00%

Publisher:

Abstract:

The paper analyses empirical performance data of five commercial PV plants in Germany. The purpose was, on the one hand, to investigate the weak-light performance of the different PV modules used and, on the other hand, to quantify and compare the shading losses of different PV array configurations. The importance of this study lies in the fact that even if the behaviour under weak-light conditions or the shading losses might seem to be a relatively small percentage of the total yearly output, in projects where a performance guarantee is given these variations can make the difference between meeting the conditions or not. When analyzing the data, a high dispersion was found. To reduce the optical losses and spectral effects, a series of data filters was applied based on the angle of incidence and absolute air mass. To compensate for the temperature effects and translate the values to STC (25°C), five different methods were assessed. In the end, Procedure 2 of IEC 60891 was selected due to its relative simplicity, its use of mostly standard parameters found in datasheets, its good accuracy even with missing values, and its potential to improve the results when the complete set of inputs is available. After analyzing the data, the weak-light performance of the modules did not show a clear superiority of a certain technology or technology group over the others. Moreover, the uncertainties in the measurements restricted the conclusiveness of the results. In the partial-shading analysis, the landscape mounting of mc-Si PV modules in the free field showed a significantly better performance than the portrait one. The cross-table string using CIGS modules did not prove the benefits expected and actually performed more poorly than a regular one-string-per-table layout. Parallel substrings with CdTe showed proper functioning and relatively low losses. Among the two product generations of CdTe analyzed, neither showed a significantly better performance under partial shading.
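The idea behind translating measured values to STC can be illustrated with a plain power-coefficient correction; note that this is a simplification, not the full IEC 60891 Procedure 2 (which translates the whole I-V curve), and all numbers are invented:

```python
# Simplified translation of measured PV module power to STC (25 °C)
p_measured_w = 182.0   # measured module power, W (hypothetical)
t_module_c = 45.0      # module temperature at measurement, °C (hypothetical)
gamma = -0.004         # power temperature coefficient, 1/K (typical for c-Si)

# Undo the temperature-induced power loss relative to 25 °C
p_stc_w = p_measured_w / (1 + gamma * (t_module_c - 25.0))
```

At 20 K above STC temperature, a c-Si module loses roughly 8% of its power, which this correction recovers.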

Relevance:

30.00%

Publisher:

Abstract:

Objective: To define and evaluate a Computer Vision (CV) method for scoring Paced Finger-Tapping (PFT) in Parkinson's disease (PD) using quantitative motion analysis of the index fingers, and to compare the obtained scores to the UPDRS (Unified Parkinson's Disease Rating Scale) finger-taps (FT). Background: The naked-eye evaluation of PFT in clinical practice provides only coarse resolution for determining PD status. Besides, sensor mechanisms for PFT evaluation may cause patients discomfort. In order to avoid the cost and effort of applying wearable sensors, a CV system for non-invasive PFT evaluation is introduced. Methods: A database of 221 PFT videos from 6 PD patients was processed. The subjects were instructed to position their hands above their shoulders, beside the face, and tap the index finger against the thumb consistently and with speed. They faced a pivoted camera during recording. The videos were rated by two clinicians between symptom levels 0 and 3 using UPDRS-FT. The CV method incorporates a motion analyzer and a face detector. The method detects the face of the testee in each video frame. The frame is split into two images from the centre of the face rectangle. Two regions of interest are located in each image to detect index-finger motion of the left and right hands, respectively. Tracking the opening and closing phases of the dominant hand's index finger produces a tapping time series. This time series is normalized by the face height. The normalization calibrates the amplitude of the tapping signal, which is affected by the varying distance between camera and subject (the farther the camera, the smaller the amplitude). A total of 15 features were classified using a K-nearest neighbour (KNN) classifier to characterize the symptom levels in UPDRS-FT. The target ratings provided by the two raters were averaged. Results: A 10-fold cross-validation in KNN classified the 221 videos between 3 symptom levels with 75% accuracy.
An area under the receiver operating characteristic curve of 82.6% supports the feasibility of using the obtained features to replicate clinical assessments. Conclusions: The system is able to track index-finger motion to estimate tapping symptoms in PD. It has certain advantages compared to other technologies (e.g. magnetic sensors, accelerometers, etc.) for PFT evaluation, to improve and automate the ratings.
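The KNN step can be sketched in a few lines; the two-dimensional "features" and labels below are invented toy data, not the study's 15 features:

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    # rank training samples by Euclidean distance to the query point x
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    # majority vote among the k nearest neighbours
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# toy tapping features (tap rate, normalized amplitude) -> UPDRS-FT level
train = [(5.0, 0.9), (4.8, 0.8), (2.0, 0.4), (2.2, 0.5), (0.9, 0.2), (1.0, 0.3)]
labels = [0, 0, 1, 1, 2, 2]
```

A new video's feature vector is then assigned the majority symptom level of its nearest neighbours, e.g. `knn_predict(train, labels, (4.9, 0.85))`.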

Relevance:

30.00%

Publisher:

Abstract:

Introduction Performance in cross-country skiing is influenced by the skier’s ability to continuously produce propelling forces and by force magnitude in relation to the net external forces. A surrogate indicator of the “power supply” in cross-country skiing would be a physiological variable that reflects an important performance-related capability, whereas body mass itself is an indicator of the “power demand” experienced by the skier. To adequately evaluate an elite skier’s performance capability, it is essential to establish the optimal ratio between the physiological variable and body mass. The overall aim of this doctoral thesis was to investigate the importance of body-mass exponent optimization for the evaluation of performance capability in cross-country skiing. Methods In total, 83 elite cross-country skiers (56 men and 27 women) volunteered to participate in the four studies. The physiological variables of maximal oxygen uptake (V̇O2max) and oxygen uptake corresponding to a blood-lactate concentration of 4 mmol∙l⁻¹ (V̇O2obla) were determined during treadmill roller skiing using the diagonal-stride technique; mean oxygen uptake (V̇O2dp) and upper-body power output (Ẇ) were determined during double-poling tests using a ski ergometer. Competitive performance data for elite male skiers were collected from two 15-km classical-technique skiing competitions and a 1.25-km sprint prologue; additionally, a 2-km roller-skiing time trial using the double-poling technique was used as an indicator of upper-body performance capability among elite male and female junior skiers. Power-function modelling was used to explain the race and time-trial speeds based on the physiological variables and body mass.
Results The optimal V̇O2max-to-mass ratios to explain 15-km race speed were V̇O2max divided by body mass raised to the 0.48 and 0.53 power, and these models explained 68% and 69% of the variance in mean skiing speed, respectively; moreover, the 95% confidence intervals (CI) for the body-mass exponents did not include either 0 or 1. For the modelling of race speed in the sprint prologue, body mass failed to contribute to the models based on V̇O2max, V̇O2obla, and V̇O2dp. The upper-body power output-to-body mass ratio that optimally explained time-trial speed was Ẇ ∙ m-0.57 and the model explained 63% of the variance in speed. Conclusions The results in this thesis suggest that V̇O2max divided by the square root of body mass should be used as an indicator of performance in 15-km classical-technique races among elite male skiers rather than the absolute or simple ratio-standard scaled expression. To optimally explain an elite male skier’s performance capability in sprint prologues, power-function models based on oxygen-uptake variables expressed absolutely are recommended. Moreover, to evaluate elite junior skiers’ performance capabilities in 2-km double-poling roller-skiing time trials, it is recommended that Ẇ divided by the square root of body mass should be used rather than absolute or simple ratio-standard scaled expression of power output.
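The power-function modelling can be sketched as a log-linear regression, where the fitted coefficient on log body mass plays the role of the optimized body-mass exponent. The data below are synthetic, generated with a known exponent of -0.5, not the thesis measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
vo2 = rng.uniform(4.5, 6.5, n)      # VO2max, l/min (simulated)
mass = rng.uniform(65.0, 90.0, n)   # body mass, kg (simulated)
# simulate speed = a * VO2max * mass^-0.5 with small multiplicative noise
speed = 2.0 * vo2 * mass**-0.5 * np.exp(rng.normal(0.0, 0.01, n))

# log-linear least squares: ln v = ln a + b1*ln(VO2max) + b2*ln(mass)
A = np.column_stack([np.ones(n), np.log(vo2), np.log(mass)])
coef, *_ = np.linalg.lstsq(A, np.log(speed), rcond=None)
mass_exponent = coef[2]             # recovered body-mass exponent (~ -0.5)
```

With real race data, the recovered exponent and its confidence interval are what distinguish the optimal scaling from the absolute (exponent 0) and simple ratio-standard (exponent -1) expressions.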

Relevance:

30.00%

Publisher:

Abstract:

Data mining can be used in the healthcare industry to “mine” clinical data and discover hidden information for intelligent and effective decision making. Hidden patterns and relationships often go undiscovered, yet advanced data mining techniques can serve as a remedy in this scenario. This thesis mainly deals with Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms applied to predict chronic renal disease. Data from the database are initially imported into Weka (3.6), and the chi-square method is used for feature selection. After normalizing the data, three classifiers were applied and the efficiency of the output was evaluated: Decision Tree, Naïve Bayes, and the K-Nearest Neighbour algorithm. The results show that each technique has its unique strength in realizing the objectives of the defined mining goals. The efficiency of the Decision Tree and KNN was almost the same, but Naïve Bayes proved to have a comparative edge over the others. Further, sensitivity and specificity tests are used as statistical measures to examine the performance of the binary classification. Sensitivity (also called recall rate in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models. It consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
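The two measures are straightforward to compute from a 2x2 confusion matrix; the counts below are invented for illustration:

```python
# Hypothetical binary classification results for a renal-disease classifier
tp, fn = 45, 5    # actual positives: correctly / wrongly classified
tn, fp = 40, 10   # actual negatives: correctly / wrongly classified

sensitivity = tp / (tp + fn)   # recall: share of actual positives found
specificity = tn / (tn + fp)   # share of actual negatives correctly rejected
```

A clinically useful screening classifier typically needs high sensitivity so that true cases are not missed, even at some cost in specificity.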

Relevance:

30.00%

Publisher:

Abstract:

A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
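A minimal sketch of one of the simplest variance-heterogeneity tests in the spirit of those surveyed (a Levene/Brown-Forsythe-type comparison of spread between genotype groups), on invented toy data:

```python
import statistics

# Toy phenotype values for two genotype groups at a putative vQTL; the
# second allele is simulated with larger spread (all numbers invented).
geno0 = [0.1, -0.2, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1]
geno1 = [1.5, -1.8, 2.1, -1.2, 0.9, -2.0, 1.7, -1.1]

def abs_dev_from_median(xs):
    # Brown-Forsythe transform: absolute deviation from the group median
    m = statistics.median(xs)
    return [abs(x - m) for x in xs]

# Compare mean absolute deviations between genotype groups; a real test
# would then compare these via an ANOVA F-statistic on the deviations.
mad0 = statistics.mean(abs_dev_from_median(geno0))
mad1 = statistics.mean(abs_dev_from_median(geno1))
```

A variance-controlling locus shows up as a clear difference in spread (here `mad1` greatly exceeds `mad0`) even when the group means are similar.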

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Annually, 2.8 million neonatal deaths occur worldwide, despite the fact that three-quarters of them could be prevented if available evidence-based interventions were used. Facilitation of community groups has been recognized as a promising method to translate knowledge into practice. In northern Vietnam, the Neonatal Health - Knowledge Into Practice trial evaluated facilitation of community groups (2008-2011) and succeeded in reducing the neonatal mortality rate (adjusted odds ratio, 0.51; 95 % confidence interval 0.30-0.89). The aim of this paper is to report on the process (implementation and mechanism of impact) of this intervention. METHODS: Process data were excerpted from diary information from meetings with facilitators and intervention groups, and from supervisor records of monthly meetings with facilitators. Data were analyzed using descriptive statistics. An evaluation including attributes and skills of facilitators (e.g., group management, communication, and commitment) was performed at the end of the intervention using a six-item instrument. Odds ratios were analyzed, adjusted for cluster randomization using general linear mixed models. RESULTS: To ensure eight active facilitators over 3 years, 11 Women's Union representatives were recruited and trained. Of the 44 intervention groups, composed of health staff and commune stakeholders, 43 completed their activities until the end of the study. In total, 95 % (n = 1508) of the intended monthly meetings with an intervention group and a facilitator were conducted. The overall attendance of intervention group members was 86 %. The groups identified 32 unique problems and implemented 39 unique actions. The identified problems targeted health issues concerning both women and neonates. Actions implemented were mainly communication activities. 
Communes supported by a group with a facilitator who was rated high on attributes and skills (n = 27) had lower odds of neonatal mortality (odds ratio, 0.37; 95 % confidence interval, 0.19-0.73) than control communes (n = 46). CONCLUSIONS: This evaluation identified several factors that might have influenced the outcomes of the trial: continuity of intervention groups' work, adequate attributes and skills of facilitators, and targeting problems along a continuum of care. Such factors are important to consider in scaling-up efforts.
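As a reminder of how an odds ratio like the reported 0.37 arises, here is the raw 2x2 computation; the counts below are invented, and the trial's actual estimate came from a cluster-adjusted mixed model, not a raw table:

```python
# Hypothetical neonatal outcome counts (invented for illustration)
deaths_int, survivors_int = 15, 985     # intervention communes
deaths_ctrl, survivors_ctrl = 39, 961   # control communes

odds_int = deaths_int / survivors_int       # odds of death, intervention
odds_ctrl = deaths_ctrl / survivors_ctrl    # odds of death, control
odds_ratio = odds_int / odds_ctrl           # < 1 favours the intervention
```

An odds ratio well below 1, with a confidence interval excluding 1, is what indicates the mortality reduction reported above.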