984 results for Digital Cartography Applied to Historical Maps


Relevance:

100.00%

Publisher:

Abstract:

Purpose: To demonstrate that relatively simple third-order theory can provide a framework showing how peripheral refraction can be manipulated by altering the forms of spectacle lenses. Method: Third-order equations were used to yield lens forms that correct peripheral power errors, either for the lenses alone or in combination with typical peripheral refractions of myopic eyes. These results were compared with those of finite ray-tracing. Results: The approximate forms of spherical and conicoidal lenses provided by third-order theory were flatter over a moderate myopic range than the forms obtained by rigorous ray-tracing. Lenses designed to correct peripheral refractive errors produced large errors when used with foveal vision and a rotating eye. Correcting astigmatism tended to produce large mean oblique errors, and vice versa. When only spherical lens forms are used, it appears impossible in the majority of cases either to correct the relative hypermetropic peripheral refractions observed experimentally in myopic eyes, or to provide relative myopic peripheral refractions in such eyes. Conclusion: The third-order approach to spectacle lens design can readily be used to show trends in peripheral refraction.

The rapid detection of abrupt, unknown parameter changes in an observed hidden Markov model (HMM) is important in several applications. Motivated by the recent application of relative-entropy concepts to the robust sequential change detection problem (and the related model selection problem), this paper proposes a sequential change detection algorithm built on a relative-entropy-based HMM parameter estimator. The proposed approach overcomes the lack of knowledge of post-change parameters, and is shown to perform similarly to the popular cumulative sum (CUSUM) algorithm (which requires knowledge of the post-change parameter values) when examined, on both simulated and real data, in a vision-based aircraft manoeuvre detection problem.
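
The CUSUM benchmark mentioned above can be sketched in a few lines. The following is a generic illustration for detecting a known mean shift in i.i.d. Gaussian data; the Gaussian setting and all parameter values are invented for illustration and are much simpler than the paper's HMM formulation:

```python
import random

def cusum(observations, mu0, mu1, sigma, threshold):
    """Classical CUSUM for a known mean shift mu0 -> mu1 in i.i.d.
    Gaussian observations. Returns the first alarm index, or None."""
    s = 0.0
    for t, x in enumerate(observations):
        # Log-likelihood ratio of the post-change vs pre-change density.
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        s = max(0.0, s + llr)   # reset at zero; accumulate evidence of change
        if s > threshold:
            return t
    return None

random.seed(1)
# 100 pre-change samples (mean 0), then 100 post-change samples (mean 1).
data = [random.gauss(0, 1) for _ in range(100)] + \
       [random.gauss(1, 1) for _ in range(100)]
alarm = cusum(data, mu0=0.0, mu1=1.0, sigma=1.0, threshold=8.0)
```

With these settings the alarm typically fires a short delay after the change at index 100; raising the threshold trades a longer detection delay for fewer false alarms. CUSUM needs the post-change parameters (here mu1), which is exactly the knowledge the paper's relative-entropy estimator avoids.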

This paper focuses on the importance of foregrounding the development of historical thinking in the implementation of the Australian Curriculum: History, as a way of making the study of history meaningful for students. In doing so, it argues that teachers need to take up the opportunity to situate the study of Asia as a significant component of the curriculum's 'Australia in a world history' approach. In discussing the significance of historical thinking, the paper specifically addresses the seven historical concepts articulated in the new history curriculum, drawing on the international scholarship in history education on the ways in which children and adolescents think about historical content and concepts.

Authentic assessment tasks enhance the engagement, retention and aspirations of students. This paper explores the discipline-generic features of authentic assessment, which reflect what students need to achieve in the real world. Because some assessment tasks are more authentic than others, the paper proposes a literature-supported framework that helps unit coordinators determine the level of authenticity of an assessment task. The framework is applied to three summative assessment tasks in a law unit: tutorial participation, an advocacy exercise, and a problem-based exam. The levels of authenticity of the three tasks are compared, and opportunities to improve authenticity are identified.

There remains a substantial shortfall in the treatment of severe skeletal injuries. The current gold standard, autologous bone grafting from the patient's own body, has many undesirable side effects, such as donor-site morbidity. Tissue engineering seeks to offer a solution to this problem. The primary requirements for tissue-engineered scaffolds are already well established, and many materials, such as polyesters, present themselves as potential candidates for bone defects: they have comparable structural features, but they often lack the osteoconductivity required to promote adequate bone regeneration. By combining these materials with biological growth factors, which promote the infiltration of cells into the scaffold as well as their differentiation into specific cell and tissue types, it is possible to increase the formation of new bone. However, the cost and the potential complications associated with growth factors mean that controlled release is an important consideration in the design of new bone tissue engineering strategies. This review covers recent research on the encapsulation and release of growth factors within a variety of polymeric scaffolds.

The state of the practice in safety has advanced rapidly in recent years with the emergence of new tools and processes for improving the selection of the most cost-effective safety countermeasures. However, many challenges prevent fair and objective comparisons of countermeasures applied across safety disciplines (e.g. engineering, emergency services, and behavioral measures). These countermeasures operate at different spatial scales, are often funded by different financial sources and agencies, and have associated costs and benefits that are difficult to estimate. This research proposes a methodology by which both behavioral and engineering safety investments are considered and compared in a specific local context. The methodology involves a multi-stage process that enables the analyst to select countermeasures that yield high benefit-cost ratios, are targeted for a particular project, and may involve costs and benefits that accrue over varying spatial and temporal scales. The methodology is illustrated using a case study from the Geary Boulevard Corridor in San Francisco, California. The case study illustrates that: 1) the methodology enables the identification and assessment of a wide range of safety investment types at the project level; 2) the nature of crash histories lends itself to the selection of both behavioral and engineering investments, requiring cooperation across agencies; and 3) the results of the cost-benefit analysis are highly sensitive to cost and benefit assumptions, so all assumptions must be listed and justified. It is recommended that a sensitivity analysis be conducted when there is large uncertainty surrounding cost and benefit assumptions.
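
The kind of cross-discipline comparison the methodology enables can be illustrated with a discounted benefit-cost calculation over differing temporal scales. All figures below are invented for illustration and do not come from the Geary Boulevard case study:

```python
def present_value(annual_benefit, years, rate):
    """Discounted present value of a constant annual benefit stream."""
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

def benefit_cost_ratio(annual_benefit, years, rate, cost):
    """Ratio of discounted benefits to upfront cost; > 1 means net benefit."""
    return present_value(annual_benefit, years, rate) / cost

# Two hypothetical countermeasures with different temporal scales:
# an engineering treatment (high upfront cost, 20-year service life) and
# a behavioral program (low cost, benefits lasting only 3 years).
engineering = benefit_cost_ratio(annual_benefit=120_000, years=20,
                                 rate=0.04, cost=900_000)
behavioral = benefit_cost_ratio(annual_benefit=80_000, years=3,
                                rate=0.04, cost=150_000)
```

Here the engineering treatment's longer benefit stream outweighs its higher upfront cost, but changing the discount rate or assumed service life can reverse such rankings, which is why listing assumptions and running a sensitivity analysis matters.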

Background: Adolescent idiopathic scoliosis (AIS) is a deformity of the spine which may require surgical correction by attaching a rod to the patient's spine using screws implanted in the vertebral bodies. Surgeons achieve an intra-operative reduction in the deformity by applying compressive forces across the intervertebral disc spaces while they secure the rod to the vertebrae. We were interested in understanding how the deformity correction is influenced by increasing magnitudes of surgical corrective force, and what tissue-level stresses are predicted at the vertebral endplates due to the surgical correction. Methods: Patient-specific finite element models of the osseoligamentous spine and ribcage of eight AIS patients who underwent single-rod anterior scoliosis surgery were created using pre-operative computed tomography (CT) scans. The surgically altered spine, including the titanium rod and vertebral screws, was simulated. The models were analysed using data for intra-operatively measured compressive forces; three load profiles, representing the mean and the upper and lower standard deviations of these data, were analysed. Data for the clinically observed deformity correction (Cobb angle) were compared with the model-predicted correction, and the model results were investigated to better understand the influence of increased compressive forces on the biomechanics of the instrumented joints. Results: The predicted corrected Cobb angles for seven of the eight FE models were within the 5° clinical Cobb measurement variability for at least one of the force profiles. The largest portion of the overall correction was predicted at or near the apical intervertebral disc for all load profiles. Model predictions for four of the eight patients showed endplate-to-endplate contact occurring on adjacent endplates of one or more intervertebral disc spaces in the instrumented curve following the surgical loading steps. Conclusion: This study demonstrated a direct relationship between intra-operative joint compressive forces and the degree of deformity correction achieved. The majority of the deformity correction occurs at, or in spinal levels adjacent to, the apex of the deformity. The study highlighted the importance of the intervertebral disc space anatomy in governing the coronal-plane deformity correction; the limit of this correction is reached when bone-to-bone contact of the opposing vertebral endplates occurs.
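
The Cobb angle used above as the measure of correction is a purely geometric quantity: the angle between the superior endplate of the upper end vertebra and the inferior endplate of the lower end vertebra in the coronal plane. A minimal sketch of its computation from two endplate direction vectors (the angles below are illustrative, not patient data):

```python
import math

def cobb_angle(upper_endplate, lower_endplate):
    """Coronal-plane Cobb angle in degrees between two endplates,
    each given as a 2D direction vector."""
    ux, uy = upper_endplate
    lx, ly = lower_endplate
    dot = ux * lx + uy * ly
    norm = math.hypot(ux, uy) * math.hypot(lx, ly)
    return math.degrees(math.acos(dot / norm))

# Endplates tilted +25 deg and -20 deg from horizontal: Cobb angle of 45 deg.
upper = (math.cos(math.radians(25)), math.sin(math.radians(25)))
lower = (math.cos(math.radians(-20)), math.sin(math.radians(-20)))
angle = cobb_angle(upper, lower)
```

The 5° clinical measurement variability quoted above reflects how sensitive this angle is to the manual identification of the endplate directions on radiographs.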

Australia should take the opportunity to clearly permit transformative uses of existing material, without requiring consideration of all four fairness factors. The key test should be the effect on the core licensing market of existing copyright expression. To accomplish this, a new transformative use exception should be introduced into Australian law. Alternatively, it should be presumptively fair to make a transformative use of existing material, regardless of commercial purpose, the character of the plaintiff's work, and amount and substantiality of the portion used, if the transformative work does not displace the market for the existing material.

The reliability of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are interdependent: it is difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is a method, based on classifier fusion techniques, for better controlling the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived from the base classifier performances. As this assumption is not always valid, the expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is evaluated empirically for text-dependent speaker verification, using hidden Markov model based digit-dependent speaker models in each stage, with multiple attempts allowed for each digit utterance. The trade-off between the verification errors is controlled by two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation (tuning) set. The statistical validity of the derived error estimates is evaluated on test data. The performance of the sequential method is further shown to depend on the order in which digits (instances) are combined and on the nature of the repeated attempts (samples).
The false rejection and false acceptance rates of the proposed fusion are estimated using the base classifier performances, the variance in the correlation between classifier decisions, and a sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor dependent favourable combinations and class-error-based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and internet shopping. The tuning of the parameters - the number of instances and samples - serves both the security and the user-convenience requirements of speaker-specific verification. The architecture investigated here is applicable to verification using other biometric modalities, such as handwriting, fingerprints and keystrokes.
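
For the statistically independent case, the flavour of the derived error expressions can be sketched as follows. This assumes a simple rule - a stage is passed if any attempt is accepted, and verification succeeds only if all stages are passed - and the base error rates are invented for illustration, not taken from the dissertation:

```python
def fused_error_rates(far, frr, n_instances, n_samples):
    """Verification error rates under statistical independence for a
    sequential scheme with n_instances decision stages, each allowing
    n_samples attempts: a stage is passed if ANY attempt is accepted,
    and verification succeeds only if ALL stages are passed."""
    far_stage = 1 - (1 - far) ** n_samples   # impostor accepted at a stage
    frr_stage = frr ** n_samples             # client rejected on every attempt
    fused_far = far_stage ** n_instances
    fused_frr = 1 - (1 - frr_stage) ** n_instances
    return fused_far, fused_frr

# Hypothetical base classifier: 5% FAR and 5% FRR per attempt,
# fused over 4 digit stages with 2 attempts each.
far, frr = fused_error_rates(0.05, 0.05, n_instances=4, n_samples=2)
```

Adding stages drives the false acceptance rate down while slightly raising the false rejection rate, and extra attempts do the opposite - this is the controllable trade-off described above; correlated decisions modify these expressions, as the dissertation details.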

In the last fifteen years digital storytelling has come to stand for considerably more than a specific form of collaborative media production. It is also an international network of new media artists, creative practitioners, curators, scholars, and facilitating community media organisations. In May this year the movement will converge on Ankara, Turkey for its Fifth International Conference and Exhibition. The event will draw together key adopters, adapters and innovators in community-based methods of collaborative media production from around the world. Researchers from the Queensland University of Technology will lead a delegation that will include key players in the Australian digital storytelling movement.

We study the natural problem of secure n-party computation (in the computationally unbounded attack model) of circuits over an arbitrary finite non-Abelian group (G,⋅), which we call G-circuits. Besides its intrinsic interest, this problem is also motivated by a completeness result of Barrington, stating that such protocols can be applied to the general secure computation of arbitrary functions. For flexibility, we are interested in protocols that require only black-box access to the group G (i.e. the only computations performed by players in the protocol are a group operation, a group inverse, or sampling a uniformly random group element). Our investigations focus on the passive adversarial model, where up to t of the n participating parties are corrupted.
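
A standard building block for such black-box protocols is n-out-of-n multiplicative secret sharing, which uses only the group operation, inversion, and uniform sampling. The sketch below is an illustration of that building block (not the paper's protocol), instantiated for the small non-Abelian group S3 with permutations as elements:

```python
import random
from itertools import permutations

# Black-box group operations for the symmetric group S3, with elements
# represented as permutation tuples.
def op(a, b):
    """Composition (a . b)(i) = a[b[i]]."""
    return tuple(a[b[i]] for i in range(len(b)))

def inv(a):
    """Group inverse of a permutation."""
    r = [0] * len(a)
    for i, v in enumerate(a):
        r[v] = i
    return tuple(r)

ELEMENTS = list(permutations(range(3)))   # all 6 elements of S3

def share(x, n):
    """n-out-of-n multiplicative sharing: x = s_1 . s_2 . ... . s_n,
    where the first n-1 shares are uniformly random group elements."""
    rs = [random.choice(ELEMENTS) for _ in range(n - 1)]
    prefix = (0, 1, 2)                    # identity permutation
    for r in rs:
        prefix = op(prefix, r)
    return rs + [op(inv(prefix), x)]      # last share completes the product

def reconstruct(shares):
    acc = (0, 1, 2)
    for s in shares:
        acc = op(acc, s)
    return acc

secret = (1, 2, 0)                        # a 3-cycle in S3
shares = share(secret, n=5)
```

Any n-1 of the shares are jointly uniform and reveal nothing about the secret, so reconstruction needs all parties; achieving privacy against only t < n corrupted parties, as in the paper's model, requires the more elaborate protocols studied there.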

This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose the design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, together with the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model (the assumed true model of interest), whose likelihood is intractable. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is then formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple, tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
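
The essence of indirect inference can be sketched with a death process like the one mentioned above: simulate from the generative model over a parameter grid, summarise each simulation with the auxiliary-model MLE, and invert the estimated parameter map at the observed data's auxiliary MLE. The population size, grid, and Binomial auxiliary model below are illustrative choices, not the paper's settings:

```python
import math
import random

random.seed(0)
N0, T = 50, 1.0   # initial population and observation time (illustrative)

def simulate_death(mu, n_reps):
    """Pure death process observed at time T: each of N0 individuals
    survives independently with probability exp(-mu * T)."""
    p = math.exp(-mu * T)
    return [sum(random.random() < p for _ in range(N0)) for _ in range(n_reps)]

def auxiliary_mle(counts):
    """Auxiliary model Binomial(N0, p): MLE of the survival probability."""
    return sum(counts) / (len(counts) * N0)

# Estimate the map mu -> p by simulation from the generative model
# (for this toy model the map is exp(-mu*T), but we treat it as unknown,
# as it would be for a genuinely intractable model).
grid = [0.2 + 0.1 * i for i in range(15)]
mapped = [auxiliary_mle(simulate_death(mu, 400)) for mu in grid]

# Indirect inference: match the observed data's auxiliary MLE to the
# nearest point of the estimated map.
observed = simulate_death(1.0, 400)       # "observed" data, true mu = 1.0
p_obs = auxiliary_mle(observed)
mu_hat = min(zip(grid, mapped), key=lambda gm: abs(gm[1] - p_obs))[0]
```

In the paper's design setting, an II posterior built on such a map stands in for the intractable likelihood inside the utility; the toy inversion here only conveys the mechanism.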

In response to the rail industry's lack of a consistently accepted minimal standard of training for performing incident investigations, the Australasian rail industry requested the development of a unified approach to investigator training. This paper details how the findings from a training needs analysis were applied to inform the development of a standardised training package for rail incident investigators. Data from job descriptions, training documents and subject matter experts sourced from 17 Australasian organisations were analysed and refined to yield a draft set of 10 critical competencies. Finally, the draft set of critical competencies was reviewed by industry experts to verify the accuracy and completeness of the competency list and to consider the most appropriate level of qualification for training development. The competencies identified, and the process described here for translating research into an applied training framework, can be generalised to assist practitioners and researchers in developing industry-approved standardised training packages.

Cryo-electron tomography, together with averaging of sub-tomograms containing identical particles, can reveal the structure of proteins or protein complexes in their native environment. The resolution of this technique is limited by the contrast transfer function (CTF) of the microscope. The CTF is not routinely corrected in cryo-electron tomography because of several difficulties: CTF detection is hard at low signal-to-noise ratios, and CTF correction is complicated because the images are characterised by a spatially variant CTF. Here we simulate the effects of the CTF on the resolution of the final reconstruction, before and after CTF correction, and consider the effect of errors and approximations in defocus determination. We show that errors in defocus determination are well tolerated when correcting a series of tomograms collected over a range of defocus values. We apply methods for determining the CTF parameters in low signal-to-noise images of tilted specimens, for monitoring defocus changes using the observed magnification changes, and for correcting the CTF prior to reconstruction. Using bacteriophage PRD1 as a test sample, we demonstrate that this approach improves the structure obtained by sub-tomogram averaging from cryo-electron tomograms.
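
Phase flipping is one common form of CTF correction. The 1D sketch below uses a deliberately simplified CTF (ignoring spherical aberration and amplitude contrast, and with no spatial variation, unlike the authors' tilted-specimen case) and restores the sign of the Fourier components that the CTF inverted:

```python
import math

def ctf(k, defocus_um, wavelength_pm=1.97):
    """Simplified 1D contrast transfer function,
    CTF(k) = -sin(pi * lambda * df * k^2), ignoring spherical aberration
    and amplitude contrast. k in 1/nm, defocus in micrometres,
    wavelength in picometres (about 1.97 pm at 300 kV)."""
    chi = math.pi * (wavelength_pm * 1e-3) * (defocus_um * 1e3) * k ** 2
    return -math.sin(chi)

def phase_flip(fourier_coeffs, freqs, defocus_um):
    """Phase-flip correction: negate every Fourier component whose sign
    the CTF inverted, leaving the amplitudes (|CTF| envelope) unchanged."""
    return [c * (-1.0 if ctf(k, defocus_um) < 0 else 1.0)
            for c, k in zip(fourier_coeffs, freqs)]

# A flat "object" spectrum imaged through the CTF at 2 um defocus,
# then phase-flipped; the corrected spectrum is everywhere non-negative.
freqs = [0.05 * i for i in range(1, 40)]
obj = [1.0] * len(freqs)
img = [o * ctf(k, defocus_um=2.0) for o, k in zip(obj, freqs)]
corrected = phase_flip(img, freqs, defocus_um=2.0)
```

Because the flip positions depend on the defocus, errors in defocus determination shift the CTF zeros and mis-flip bands of frequencies - the effect whose tolerance the study quantifies.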