963 results for inborn-errors


Relevance:

20.00%

Publisher:

Abstract:

Assessment forms an important part of the student learning experience, and students place a high value on the quality of the feedback they receive from academic staff on where they might improve in their examinations or assignments. However, while feedback is important, so is the quality of the assessment itself before students undertake an examination or begin writing an assignment. It is imperative that students clearly understand what is expected of them in order to achieve a particular grade, and that examinations and assignments are free of ambiguity. Biggs (2003) highlighted the importance of clarity about what students are expected to be able to do at the end of a unit of study, arguing that intended learning outcomes should be clearly aligned to the assessment and communicated to students so that they can structure their learning activities to optimise their assessment performance. However, as Rust (2002) highlighted, there are often inconsistencies in assessment practices, ranging from a mismatch between assessment and learning outcomes to the inclusion of additional learning criteria and a lack of clarity in the instructions. Such inconsistencies and unacceptable errors in examination papers can undermine student confidence in the assessment process.
In order to minimise such inconsistencies, an internal assessment group was set up in October 2013 within the School of Nursing and Midwifery at Queen's University Belfast, consisting of representative academic staff from across the range of undergraduate and postgraduate courses in nursing and midwifery. The assessment group was to be a point of reference for all school examinations, with a particular remit to develop an assessment strategy for all nursing and midwifery programmes and to ensure that all assessments comply with current best practice and with Nursing and Midwifery Council (NMC) requirements.
Aim
This paper aims to highlight some examples of good practice and common errors that were found in assignments and examinations that were submitted to the assessment group for review.
References

Biggs, J. (2003) Teaching for Quality Learning at University: What the Student Does, 2nd edition. SRHE/Open University Press, Buckingham.
Rust, C. (2002) The impact of assessment on student learning. Active Learning in Higher Education, 3(2), 145-158.

Abstract:

OBJECTIVE:

To estimate the prevalence of refractive errors in persons 40 years and older.

METHODS:

Counts of persons with phakic eyes with and without spherical equivalent refractive error in the worse eye of +3 diopters (D) or greater, -1 D or less, and -5 D or less were obtained from population-based eye surveys in strata of gender, race/ethnicity, and 5-year age intervals. Pooled age-, gender-, and race/ethnicity-specific rates for each refractive error were applied to the corresponding stratum-specific US, Western European, and Australian populations (years 2000 and projected 2020).
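The pooling step described above is simple arithmetic: stratum-specific rates are multiplied by the corresponding stratum population counts and summed. A minimal sketch, using entirely hypothetical rates and populations (not the study's data):

```python
# Hedged sketch of rate pooling across strata; all numbers are
# illustrative assumptions, not the survey data.

def pooled_prevalence(rates, populations):
    """Apply per-stratum prevalence rates to stratum population counts.

    rates: dict mapping stratum key -> prevalence (fraction)
    populations: dict mapping the same keys -> persons in that stratum
    Returns (total affected persons, crude prevalence over all strata).
    """
    affected = sum(rates[s] * populations[s] for s in populations)
    total = sum(populations.values())
    return affected, affected / total

# Hypothetical strata keyed by (gender, 5-year age band).
rates = {("f", "40-44"): 0.05, ("m", "40-44"): 0.04,
         ("f", "75+"): 0.20, ("m", "75+"): 0.18}
pops = {("f", "40-44"): 1_000_000, ("m", "40-44"): 950_000,
        ("f", "75+"): 400_000, ("m", "75+"): 300_000}

affected, crude = pooled_prevalence(rates, pops)
```

Applying the same age-, gender-, and race/ethnicity-specific rates to a projected 2020 population is then just a second call with different population counts.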

RESULTS:

Six studies provided data from 29 281 persons. In the US, Western European, and Australian year 2000 populations 40 years or older, the estimated crude prevalence for hyperopia of +3 D or greater was 9.9%, 11.6%, and 5.8%, respectively (11.8 million, 21.6 million, and 0.47 million persons). For myopia of -1 D or less, the estimated crude prevalence was 25.4%, 26.6%, and 16.4% (30.4 million, 49.6 million, and 1.3 million persons), respectively, of whom 4.5%, 4.6%, and 2.8% (5.3 million, 8.5 million, and 0.23 million persons), respectively, had myopia of -5 D or less. Projected prevalence rates in 2020 were similar.

CONCLUSIONS:

Refractive errors affect approximately one third of persons 40 years or older in the United States and Western Europe, and one fifth of Australians in this age group.

Abstract:

This work addresses the joint compensation of I/Q imbalances and carrier phase synchronization errors in zero-IF receivers. The compensation scheme is based on blind source separation, which provides a simple yet potent means of jointly compensating for these errors independently of the modulation format and constellation size used. The low complexity of the algorithm makes it a suitable option for real-time deployment as well as practical for integration into monolithic receiver designs.
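As a rough illustration of blind I/Q compensation (not the paper's algorithm), the following sketch applies a circularity-restoring stochastic update to a QPSK signal distorted by a hypothetical receiver gain/phase imbalance; the model, update rule, and all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
# Unit-energy QPSK symbols.
s = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)

# Assumed imbalance model: x = K1*s + K2*conj(s), from gain g and phase phi.
g, phi = 1.1, np.deg2rad(8.0)
K1 = (1 + g * np.exp(-1j * phi)) / 2
K2 = (1 - g * np.exp(1j * phi)) / 2
x = K1 * s + K2 * np.conj(s)

# Blind compensation: y = x - w*conj(x); the update drives E[y^2] toward 0
# (restores circularity), which makes the image term vanish.
w, mu = 0.0 + 0.0j, 1e-3
y = np.empty_like(x)
for i in range(n):
    y[i] = x[i] - w * np.conj(x[i])
    w += mu * y[i] ** 2

def image_rejection_db(z, ref):
    """Power of the image component relative to the desired one, in dB."""
    a = np.vdot(ref, z)            # projection onto the desired signal
    b = np.vdot(np.conj(ref), z)   # projection onto its image
    return 10 * np.log10(abs(b) ** 2 / abs(a) ** 2)

# Measure on the second half, after the loop has converged.
before = image_rejection_db(x[n // 2:], s[n // 2:])
after = image_rejection_db(y[n // 2:], s[n // 2:])
```

Note that nothing in the loop depends on the constellation beyond second-order statistics, which is the modulation-independence the abstract refers to.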

Abstract:

In this paper, we carry out a detailed performance analysis of the blind-source-separation-based I/Q corrector operating at baseband. The performance of the digital I/Q corrector is evaluated not only under time-varying phase and gain errors but also in the presence of multipath and Rayleigh fading channels, as well as at low SNR and for different modulation formats and constellation sizes. The BER improvement after correction is also illustrated. The results indicate that the adaptive algorithm offers adequate performance for most communication applications, hence reducing the matching requirements of the analog front-end and enabling higher levels of integration.

Abstract:

This paper explores the benefits of compensating, in the receiver, for transmitter gain and phase imbalances in quadrature communication systems. It is assumed that the gain and phase imbalances are introduced at the transmitter only. A simple non-data-aided DSP algorithm is used at the receiver to compensate for the imbalances. Computer simulations have been performed to study a coherent QPSK communication system.
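A minimal sketch of a non-data-aided compensator of this general kind, assuming a simple transmitter imbalance model and illustrative gain/phase values (this is not the paper's algorithm): the imbalance parameters are estimated from second-order statistics of the received I/Q streams and then inverted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50000
# QPSK I/Q streams, unit symbol energy.
I = rng.choice([-1.0, 1.0], n) / np.sqrt(2)
Q = rng.choice([-1.0, 1.0], n) / np.sqrt(2)

# Assumed transmitter imbalance model: the Q branch picks up a gain error g
# and a phase error phi; the I branch is the reference.
g, phi = 1.15, np.deg2rad(10.0)
xI = I
xQ = g * (Q * np.cos(phi) + I * np.sin(phi))

# Non-data-aided estimation from second-order statistics:
#   E[xQ^2] / E[xI^2] = g^2,   E[xI * xQ] = g * sin(phi) * E[xI^2]
g_hat = np.sqrt(np.mean(xQ ** 2) / np.mean(xI ** 2))
sin_hat = np.mean(xI * xQ) / (g_hat * np.mean(xI ** 2))
cos_hat = np.sqrt(1.0 - sin_hat ** 2)

# Invert the imbalance with the estimated parameters.
I_hat = xI
Q_hat = (xQ / g_hat - xI * sin_hat) / cos_hat

mse = np.mean((I_hat - I) ** 2 + (Q_hat - Q) ** 2)
```

The estimator only assumes the I and Q data streams are zero-mean, uncorrelated, and of equal power, so no training symbols are needed.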

Abstract:

Among the largest resources for biological sequence data is the large number of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors, which must be taken into account when analyzing EST sequences computationally. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
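To illustrate the HMM machinery involved (not the paper's model), here is a toy Viterbi decoder over two hidden states, with made-up transition and emission probabilities biased so that GC-rich stretches decode as "coding":

```python
import math

# Toy two-state HMM; all probabilities are illustrative assumptions.
states = ("coding", "noncoding")
start = {"coding": 0.5, "noncoding": 0.5}
trans = {"coding": {"coding": 0.9, "noncoding": 0.1},
         "noncoding": {"coding": 0.1, "noncoding": 0.9}}
emit = {"coding": {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1},
        "noncoding": {"A": 0.4, "C": 0.1, "G": 0.1, "T": 0.4}}

def viterbi(seq):
    """Most probable state path for seq (log-space to avoid underflow)."""
    v = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for obs in seq[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((p, v[-1][p] + math.log(trans[p][s])) for p in states),
                key=lambda t: t[1])
            col[s] = score + math.log(emit[s][obs])
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    # Trace back from the best final state.
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

The paper's model would replace these two states with a richer mRNA topology (UTRs, start/stop sites, error-tolerant codon states), but the decoding algorithm is the same.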

Abstract:

This note investigates the adequacy of the finite-sample approximation provided by the Functional Central Limit Theorem (FCLT) when the errors are allowed to be dependent. We compare the distribution of the scaled partial sums of some data with the distribution of the Wiener process to which it converges. Our setup is purposely very simple in that it considers data generated from an ARMA(1,1) process. Yet, this is sufficient to bring out interesting conclusions about the particular elements which cause the approximations to be inadequate in even quite large sample sizes.
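The setup can be sketched directly: simulate ARMA(1,1) errors, form the scaled partial sum, and check its distribution against the standard normal limit implied by the FCLT. Parameter values below are illustrative, not those used in the note:

```python
import numpy as np

rng = np.random.default_rng(0)
R, T, burn = 2000, 2000, 100      # replications, sample size, burn-in
phi, theta = 0.5, 0.3             # illustrative ARMA(1,1) parameters

# Simulate e_t = phi*e_{t-1} + u_t + theta*u_{t-1}, with u_t ~ N(0, 1).
u = rng.standard_normal((R, T + burn))
e = np.zeros((R, T + burn))
for t in range(1, T + burn):
    e[:, t] = phi * e[:, t - 1] + u[:, t] + theta * u[:, t - 1]
e = e[:, burn:]                   # discard burn-in for near-stationarity

# FCLT scaling: S_T = sum(e_t) / (sigma_lr * sqrt(T)) -> N(0, 1), where the
# long-run standard deviation of this ARMA(1,1) is (1 + theta) / (1 - phi).
sigma_lr = (1 + theta) / (1 - phi)
S = e.sum(axis=1) / (sigma_lr * np.sqrt(T))

m, v = S.mean(), S.var()          # should be near 0 and 1 at this T
```

The interesting cases in the note are precisely those where this match deteriorates, e.g. with the moving-average root near minus one, so that sigma_lr is close to zero and the normalization becomes fragile.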

Abstract:

In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which includes normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to cast more evidence on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods; (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption; and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
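For the Gaussian case, the Gibbons-Ross-Shanken statistic that these tests reduce to can be sketched as follows; the simulated returns, betas, and noise levels are purely illustrative, and the data are drawn under the null of zero alphas:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 120, 5                     # time periods, test assets (illustrative)

# Simulate excess returns under the null of mean-variance efficiency:
# r_it = beta_i * m_t + eps_it (all alphas are zero).
m = rng.normal(0.006, 0.045, T)               # market excess return
beta = rng.uniform(0.5, 1.5, N)
eps = rng.normal(0.0, 0.02, (T, N))
r = m[:, None] * beta[None, :] + eps

# OLS of each asset on [1, m]; alpha_hat is the intercept vector.
X = np.column_stack([np.ones(T), m])
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
alpha = coef[0]
resid = r - X @ coef
Sigma = resid.T @ resid / T                   # MLE residual covariance

# GRS statistic, distributed F(N, T-N-1) under the null with Gaussian errors.
theta2 = (m.mean() / m.std()) ** 2            # squared sample Sharpe ratio
grs = (T - N - 1) / N * (alpha @ np.linalg.solve(Sigma, alpha)) / (1 + theta2)
```

The exact non-Gaussian tests of the paper replace the F reference distribution with simulation-based (Monte Carlo) p-values over the assumed error class; that machinery is beyond this sketch.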

Abstract:

This note develops general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit the recent asymptotic distributional results in Barndorff-Nielsen and Shephard (2002a), are both easy to implement and highly accurate in empirically realistic situations. When the measurement errors in the volatility forecast evaluations reported in Andersen, Bollerslev, Diebold and Labys (2003) are properly accounted for, the adjustments result in markedly higher estimates of the true degree of return-volatility predictability.
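A minimal sketch of the measurement-error issue being adjusted for, assuming constant volatility and Gaussian intraday returns (a simplification of the note's setting): realized variance is unbiased for the integrated variance, but at any finite sampling frequency it measures it with error, and ignoring that error makes volatility forecasts look worse than they are.

```python
import numpy as np

rng = np.random.default_rng(0)
days, m = 2000, 78            # trading days; 5-minute returns per day
iv = 1e-4                     # true (constant) integrated daily variance

# Intraday returns with constant volatility; realized variance per day.
r = rng.normal(0.0, np.sqrt(iv / m), (days, m))
rv = (r ** 2).sum(axis=1)

# RV is unbiased for IV, but its scaled measurement error has variance
# 2*IV^2 in this constant-volatility Gaussian case, matching the form of
# the Barndorff-Nielsen & Shephard asymptotics.
err = np.sqrt(m) * (rv - iv)
mean_rv = rv.mean()
err_var = err.var()           # should be close to 2 * iv**2
```

Regressing forecasts on noisy RV rather than on the true IV attenuates the measured predictability; the note's adjustments correct the loss-function calculations for exactly this error variance.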