Abstract:
Total cross sections for neutron scattering from nuclei, with energies ranging from 10 to 600 MeV and from many nuclei spanning the mass range 6Li to 238U, have been analyzed using a simple, three-parameter functional form. The calculated cross sections are compared with results obtained by using microscopic (g-folding) optical potentials as well as with experimental data. The functional form reproduces those total cross sections very well. When allowance is made for Ramsauer-like effects in the scattering, the required parameters of the functional form vary smoothly with energy and target mass, so they too can be represented by functions of energy and mass.
Abstract:
Background: The proportion of older individuals in the driving population is predicted to increase over the next 50 years. This has important implications for driving safety, as abilities that are important for safe driving, such as vision (which accounts for the majority of the sensory input required for driving), processing ability and cognition, have been shown to decline with age. The current methods employed for screening older drivers upon re-licensure are also vision based. This study, which investigated social, behavioural and professional aspects involved with older drivers, aimed to determine: (i) whether the current visual standards in place for testing upon re-licensure are effective in reducing the older driver fatality rate in Australia; (ii) whether the recommended visual standards are actually implemented as part of the testing procedures by Australian optometrists; and (iii) whether there are other non-standardised tests which may be better at predicting on-road incident risk (including near misses and minor incidents) in older drivers than the tests recommended in the standards. Methods: For the first phase of the study, state-based age- and gender-stratified numbers of older driver fatalities for 2000-2003 were obtained from the Australian Transport Safety Bureau database. Poisson regression analyses of fatality rates were conducted by renewal frequency and jurisdiction (as separate models), adjusting for the possible confounding variables of age, gender and year. For the second phase, all practising optometrists in Australia were surveyed on the vision tests they conduct in consultations relating to driving and on their knowledge of the vision requirements for older drivers. Finally, for the third phase of the study, which investigated determinants of on-road incident risk, a stratified random sample of 600 Brisbane residents aged 60 years and older was selected and invited to participate using an introductory letter explaining the project requirements. In order to capture the number and type of road incidents that occurred for each participant over 12 months (including near misses and minor incidents), an important component of the prospective research study was the development and validation of a driving diary. The diary was a tool in which incidents could be logged at, or very close to, the time they occurred, minimising the recall bias that arises from relying on participant memory over time. Associations between the visual tests, cognition and the scores obtained on non-standard functional tests and both retrospective and prospective incident occurrence were investigated. Results: In the first phase, drivers aged 60-69 years had a 25% lower fatality risk (Rate Ratio [RR] = 0.75, 95% CI 0.32-1.77) in states with vision testing upon re-licensure than in states without it; however, because the CI is wide, crossing 1.00, this result should be regarded with caution. Overall fatality rates, and fatality rates for those aged 70 years and older (RR = 1.17, 95% CI 0.64-2.13), did not differ between states with and without licence renewal procedures, indicating no apparent benefit from vision testing legislation.
For the second phase of the study, nearly all optometrists measured visual acuity (VA) as part of a vision assessment for re-licensing; however, 20% of optometrists did not perform any visual field (VF) testing and only 20% routinely performed automated VF testing on older drivers, despite the licensing standards advocating automated VF as part of the vision standard. This demonstrates the need for more effective communication between policy makers and those responsible for carrying out the standards. It may also indicate that the overall higher driver fatality rate observed in jurisdictions with vision testing requirements results from the tests recommended by the standards being only partially conducted by optometrists. Hence a standardised protocol for the screening of older drivers for re-licensure must be established across the nation. The opinions of Australian optometrists regarding the responsibility of reporting older drivers who fail to meet the licensing standards highlighted the conflict between maintaining patient confidentiality and upholding public safety. Mandatory reporting of drivers who fail to reach the standards necessary for driving would minimise potential conflict between patients and their practitioners, and help maintain patient trust and goodwill. The final phase of the PhD program investigated the efficacy of vision, functional and cognitive tests in discriminating between at-risk and safe older drivers. Nearly 80% of the participants experienced an incident of some form over the prospective 12 months, with a total incident rate of 4.65 per 10,000 km. Sixty-three percent reported having a near miss and 28% had a minor incident. The results from the prospective diary study indicate that the current vision screening tests (VA and VF) used for re-licensure do not accurately identify older drivers at increased odds of having an on-road incident. However, the variation in visual measurements within the cohort was narrow, which also affected the results obtained with the visual function questionnaires; hence a larger cohort with greater variability should be considered for a future study. A slightly lower cognitive level (as measured with the Mini-Mental State Examination [MMSE]) showed an association with incident involvement, as did slower reaction time (RT); however, the Useful Field of View (UFOV) provided the most compelling results of the study. Cut-off values for UFOV processing (>23.3 ms), divided attention (>113 ms), selective attention (>258 ms) and overall score (moderate/high/very high risk) were effective in identifying older drivers at increased odds of having any on-road incident and of having minor incidents. Discussion: The results have shown that, for the 60-69 year age group, there is a potential benefit in testing vision upon licence renewal. However, overall fatality rates and fatality rates for those aged 70 years and older indicated no benefit from vision testing legislation, suggesting a need for the inclusion of screening tests that better predict on-road incidents. Although VA is routinely measured by Australian optometrists when older drivers renew their licence, VF is not. There is therefore a need for a protocol to be developed and administered which would standardise the methods used throughout the nation for screening older drivers upon re-licensure. Communication between the community, policy makers and those conducting the protocol should be maximised.
By implementing a standardised screening protocol that incorporates a level of mandatory reporting by the practitioner, the ethical dilemma of breaching patient confidentiality would also be resolved. The tests included in this screening protocol, however, cannot solely be those which have been implemented in the past. In this investigation, RT, MMSE and UFOV were shown to be better determinants of on-road incidents in older drivers than VA and VF, although, as previously mentioned, there was a lack of variability in visual status within the cohort. Nevertheless, this investigation recommends that, subject to appropriate sensitivity and specificity being demonstrated in the future using a cohort with wider variation in vision, functional performance and cognition, these tests of cognition and information processing be added to the current protocol for the screening of older drivers, which could be conducted at licensing centres across the nation.
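The Poisson regression described in the methods models fatality counts as rates by including the driving population as an exposure offset, so the rate ratio for a jurisdiction effect is the exponential of its coefficient. The following is a minimal sketch of that style of analysis using statsmodels; the data frame values, column names and model formula are illustrative assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical fatality counts per state/age/gender cell; 'testing' flags
# jurisdictions with vision testing upon re-licensure, and 'drivers' is
# the licensed-driver population used as the exposure.
df = pd.DataFrame({
    "fatalities": [4, 7, 3, 9, 5, 8],
    "drivers":    [61000, 90000, 45000, 120000, 70000, 98000],
    "testing":    [1, 0, 1, 0, 1, 0],
    "age_group":  ["60-69", "60-69", "70+", "70+", "60-69", "70+"],
    "gender":     ["F", "M", "F", "M", "M", "F"],
})

model = smf.glm(
    "fatalities ~ testing + age_group + gender",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["drivers"]),   # models the rate, not the raw count
).fit()

# Rate ratio for vision testing: exp of the fitted coefficient
print(np.exp(model.params["testing"]))
```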
Abstract:
Physiological responses to environmental stress are increasingly well studied in scleractinian corals. This work reports a new stress-related skeletal structure we term clypeotheca. Clypeotheca was observed in several live-collected common reef-building coral genera and in a two to three kya subfossil specimen from Heron Reef, Great Barrier Reef. It consists of an epitheca-like skeletal wall that seals over the surface of parts of the corallum in areas of stress or damage. It appears to form through a coordinated process wherein neighboring polyps and the adjoining coenosarc seal themselves off from the surrounding environment as they contract and die. Clypeotheca forms by inward centripetal skeletal growth at the edges of corallites and by the merging of flange-like outgrowths that surround individual spines over the surface of the coenosteum. Microstructurally, the merged flanges are similar to upside-down dissepiments and true epitheca. Clypeotheca is interpreted primarily as a response to stress that may help protect the colony from invasion by parasites or disease through unhealthy tissues, by retracting tissues in areas that have become unhealthy for the polyps. Identification of skeletal responses of corals to environmental stress may enable the frequency of certain types of environmental stress to be documented in past environments. Such data may be important for understanding the nature of reef dynamics through intervals of climate change and for monitoring the effects of possible anthropogenic stress in modern coral reef habitats.
Abstract:
"By understanding how places have evolved, we are better able to guide development and change in the urban fabric and avoid the incongruity created by so much of the modern environment" (MacCormac, R (1996), An anatomy of London, Built Environment, Dec 1996 This paper proposes a theory on the relevance of mapping the evolutionary aspects of historical urban form in order to develop a measure of evaluating architectural elements within urban forms, through to deriving parameters for new buildings. By adopting Conzen's identification of the tripartite division of urban form; the consonance inurban form of a particular palce resides in the elements and measurable values tha makeup the fine grain aggregates of urban form. The paper will demonstrate throughthe case study of Brisbane in Australia, a method of conveying these essential components that constitute a cities continuity of form and active usage. By presenting the past as a repository of urban form characteristics, it is argued that concise architectural responses that stem from such knowledge should result in an engaged urban landscape. The essential proposition is that urban morphology is a missing constituent in the process of urban design, and that the approach of the geographical discipline to the study of urban morphology holds the key to providing the evidence of urban growth characteristics, and this methodology suggests possibilities for an architectural approach that can comprehensively determine qualitative aspects of urban buildings. The relevance of this research lies in a potential to breach the limitations of current urban analysis whilst continuing the evolving currency of urban morphology as an integral practice in the design of our cities.
Abstract:
We consider one-round key exchange protocols secure in the standard model. The security analysis uses the powerful security model of Canetti and Krawczyk and a natural extension of it to the ID-based setting. It is shown how KEMs can be used in a generic way to obtain two different protocol designs with progressively stronger security guarantees. A detailed analysis of the performance of the protocols is included; surprisingly, when instantiated with specific KEM constructions, the resulting protocols are competitive with the best previous schemes that have proofs only in the random oracle model.
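To make the generic KEM-based construction concrete, here is a minimal sketch of the underlying pattern: each party sends a single KEM ciphertext under the other's long-term public key, and the session key is derived from both encapsulated secrets plus the transcript. This is an illustration only, built from an X25519-based ElGamal-style KEM using the `cryptography` package; the paper's actual protocol designs, the KEM security notions they require, and the key-derivation details needed for the standard-model proofs are not captured here.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def kem_encap(peer_pk):
    """ElGamal-style KEM: the ciphertext is an ephemeral public key."""
    esk = X25519PrivateKey.generate()
    ct = esk.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    return ct, esk.exchange(peer_pk)

def kem_decap(sk, ct):
    return sk.exchange(X25519PublicKey.from_public_bytes(ct))

def session_key(k1, k2, transcript):
    # Both encapsulated secrets (ordered by role) and the transcript
    # feed a hash standing in for a proper key-derivation function.
    return hashlib.sha256(k1 + k2 + transcript).digest()

# Long-term key pairs for parties A and B
a_sk, b_sk = X25519PrivateKey.generate(), X25519PrivateKey.generate()
a_pk, b_pk = a_sk.public_key(), b_sk.public_key()

# One round: the two ciphertexts can be sent simultaneously
ct_a, k_a = kem_encap(b_pk)   # A -> B
ct_b, k_b = kem_encap(a_pk)   # B -> A

transcript = b"A|B|" + ct_a + b"|" + ct_b
key_at_a = session_key(k_a, kem_decap(a_sk, ct_b), transcript)
key_at_b = session_key(kem_decap(b_sk, ct_a), k_b, transcript)
assert key_at_a == key_at_b
```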
Abstract:
There is a growing need for international transparency of engineering qualifications, and for mechanisms to support and facilitate student mobility. In response, a number of global initiatives are attempting to address these needs, particularly in Europe, North America and Australia. The Conceive-Design-Implement-Operate (CDIO) Initiative has a set of standards, competencies and proficiency levels developed through a global community of practice, and is a well-structured framework in which best-practice internationalisation and student mobility can be embedded. However, the current 12 CDIO Standards do not address international qualifications or student mobility. Based on an environmental scan of global activities, this paper identifies the underpinning principles of best practice, which form the basis of the proposed 13th CDIO Standard, "Internationalization and Mobility".
Abstract:
The transformation of China's urban landscape has witnessed a boom in cultural adaptation, namely the adaptation of a Western idea, the creative cluster. This chapter examines the formatting of hundreds of creative clusters: art centres, animation bases, cultural zones, and incubators. The cluster has important implications for how we understand China going forward into the second decade of the 21st century. The cluster phenomenon has resulted in a substantive remaking of the social contract between officials, entrepreneurs, local residents, academics and, most significantly, cultural producers. However, these processes of adaptation are mostly driven by real estate developers working in partnership with local government officials, and cut-and-paste design is the fast road to completion. In this sense, the description 'creative' may well be redundant.
Abstract:
Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse, symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations. Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics because the sparse structure can be exploited, typically through the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations. In this thesis, novel applications of Krylov subspace approximations to matrix functions are investigated for both of these problems. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a 'square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables. Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. In this thesis we compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for approximating matrix functions of this form. A number of novel results are presented. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation and give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b and investigate stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and for approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
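As a concrete illustration of the simplest of these techniques, the following is a minimal sketch (not the thesis code) of the basic Lanczos approximation to f(A)b with f(t) = t^(-α/2): build an orthonormal Krylov basis V_m and tridiagonal T_m with V_m^T A V_m = T_m, then use f(A)b ≈ ||b|| V_m f(T_m) e_1. The test matrix, step count m and breakdown tolerance are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import diags
from scipy.linalg import eigh_tridiagonal

def lanczos_matfunc(A, b, f, m=50):
    """Approximate f(A) @ b with an m-step Lanczos iteration.

    Full reorthogonalisation is omitted for brevity, so accuracy can
    degrade for large m in floating point.
    """
    n = b.shape[0]
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    beta_b = np.linalg.norm(b)
    V[:, 0] = b / beta_b
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:      # Krylov space is invariant; stop early
                m = j + 1
                V, alpha, beta = V[:, :m], alpha[:m], beta[:m - 1]
                break
            V[:, j + 1] = w / beta[j]
    # f(T_m) e_1 via the spectral decomposition of the small tridiagonal T_m
    theta, S = eigh_tridiagonal(alpha, beta)
    fT_e1 = S @ (f(theta) * S[0, :])
    return beta_b * (V @ fT_e1)

# Example: x ≈ A^(-1/2) z for a sparse SPD "precision" matrix (GMRF sampling)
n = 1000
A = diags([-1.0, 2.5, -1.0], offsets=[-1, 0, 1], shape=(n, n)).tocsr()
z = np.random.default_rng(0).standard_normal(n)
x = lanczos_matfunc(A, z, lambda t: t ** -0.5, m=60)
```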
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for detecting memory and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected in stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory, and heavy tails are pronounced in their probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics is described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX).
The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in two- and three-dimensional spaces of data sets, and use cross-validation to verify discriminant accuracy. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
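To make the detrending idea in Part I concrete, here is a minimal sketch of the fluctuation function at the heart of (MF-)DFA: integrate the demeaned series, remove a local polynomial trend from each window, and examine how the q-th moment of the residual fluctuations scales with window size. It is an illustration under simplifying assumptions (forward segments only, a single detrending order, q ≠ 0); a full MF-DFA also scans segments from the end of the series and sweeps a range of q values.

```python
import numpy as np

def dfa_fluctuation(x, scales, q=2, order=1):
    """Fluctuation function F_q(s) of (MF-)DFA for a 1-D series x.

    Memory shows up through the scaling F_q(s) ~ s^h(q); for q = 2 this
    is standard DFA, and h(2) > 0.5 suggests long memory.
    """
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = np.empty(n_seg)
        t = np.arange(s)
        for v in range(n_seg):
            seg = y[v * s:(v + 1) * s]
            coef = np.polyfit(t, seg, order)   # local trend of given order
            rms[v] = np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.mean(rms ** q) ** (1.0 / q))
    return np.asarray(F)

# Example: estimate h(2) for white noise (expected value near 0.5)
rng = np.random.default_rng(1)
x = rng.standard_normal(2 ** 14)
scales = np.unique(np.logspace(1.2, 3.0, 12).astype(int))
F = dfa_fluctuation(x, scales, q=2, order=1)
h2 = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"h(2) ~ {h2:.2f}")
```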
Abstract:
Tagging has become one of the key activities in next-generation websites, allowing users to select short labels to annotate, manage and share multimedia information such as photos, videos and bookmarks. Tagging requires no prior training before users participate in annotation activities, as they can freely choose any terms which best represent the semantics of the content without worrying about any formal structure or ontology. However, the practice of free-form tagging can lead to several problems, such as synonymy, polysemy and ambiguity, which potentially increase the complexity of managing the tags and retrieving information. To solve these problems, this research aims to construct a lightweight indexing scheme that structures tags by identifying and disambiguating the meaning of terms, and to construct a knowledge base or dictionary. News has been chosen as the primary domain of application to demonstrate the benefits of using structured tags for managing the rapidly changing and dynamic nature of news information. One of the main outcomes of this work is an automatically constructed vocabulary that defines the meaning of each named entity tag that can be extracted from a news article (including person, location and organisation), based on suggestions from major search engines and knowledge from public databases such as Wikipedia. To demonstrate the potential applications of the vocabulary, we have used it to provide additional functionality on an online news website, including topic-based news reading, intuitive tagging, clipping and sharing of interesting news, and news filtering or searching based on named entity tags. The evaluation results on the impact of disambiguating tags show that the vocabulary can significantly improve news searching performance. Preliminary results from our user study demonstrate that users benefit from the additional functionality on the news website, as they are able to retrieve more relevant news and to clip and share news with friends and family effectively.
Abstract:
Corneal topography estimation based on the Placido disk principle relies on good precorneal tear film quality and a sufficiently wide eyelid (palpebral) aperture to avoid reflections from eyelashes. In practice, however, these conditions are not always fulfilled, resulting in missing regions, smaller corneal coverage and, subsequently, poorer estimates of corneal topography. Our aim was to enhance the standard operating range of a Placido disk videokeratoscope to obtain reliable corneal topography estimates in patients with poor tear film quality, such as those diagnosed with dry eye, and with narrower palpebral apertures, as in the case of Asian subjects. This was achieved by incorporating into the instrument's own topography estimation algorithm an image processing technique that comprises a polar-domain adaptive filter and a morphological closing operator. Experimental results from measurements of test surfaces and real corneas showed that the proposed technique yields better estimates of corneal topography and, in many cases, a significant increase in the estimated coverage area, making such an enhanced videokeratoscope a better tool for clinicians.
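The abstract does not give implementation details, but the morphological closing step can be illustrated as follows: assuming the Placido ring image has been unwrapped into polar coordinates (rows as radius, columns as angle), a grey-level closing with a structuring element elongated along the angular axis bridges narrow dark gaps, such as eyelash shadows, in the bright rings. The function below is a hedged sketch; the polar unwrapping, the omitted adaptive filter and the structuring-element size are assumptions, not the instrument's algorithm.

```python
import numpy as np
from scipy import ndimage

def close_polar_gaps(polar_img, size=(3, 9)):
    """Bridge narrow dark gaps in a polar-unwrapped Placido ring image.

    polar_img: 2-D array of ring intensities in polar coordinates
    (rows = radius, columns = angle). The grey-level closing uses a
    structuring element elongated along the angular axis, filling thin
    interruptions in the bright rings while leaving ring geometry
    largely untouched. Columns are wrapped so the closing is seamless
    across the 0/360 degree boundary.
    """
    pad = size[1]
    wrapped = np.concatenate(
        [polar_img[:, -pad:], polar_img, polar_img[:, :pad]], axis=1)
    closed = ndimage.grey_closing(wrapped, size=size)
    return closed[:, pad:-pad]
```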
Abstract:
The aim of this research is to examine the changing nature of the risks facing journalists and media workers in the world's difficult, remote and hostile environments, and to consider the 'adequacy' of the managing hostile environment safety courses that some media organizations require prior to foreign assignments. The study utilizes several creative works and contributions to this area of analysis, which include a documentary film production, course contributions, an emergency reference handbook, security and incident management reviews, and a template for evacuation and contingency planning. The research acknowledges that employers have a 'duty of care' to personnel working in these environments, identifies the necessity for pre-deployment training and support, and provides a solution for organizations that wish to initiate a comprehensive framework to advise, monitor, protect and respond to incidents. Finally, it explores the possible development of a unique and holistic service to facilitate proactive and responsive support, in the form of a new profession of 'Editorial Logistics Officer' or 'Editorial Safety Officer' within media organizations. This area of research is vitally important to the profession, and the intended contribution is to introduce a simple and cost-efficient framework for media organizations that wish to implement pre-deployment training and field support, as these programs save lives. The complete proactive and responsive services may be several years from implementation; however, this study demonstrates that the facilitation of Managing Hostile Environment (MHE) courses should be the minimum professional standard. These courses have saved lives in the past and they provide journalists with the tools to "cover the story, and not become the story."
Abstract:
Scalable video coding in the H.264/AVC standard enables adaptive and flexible delivery to multiple devices under various network conditions. Only a few works have addressed the influence of the different scalability parameters (frame rate, spatial resolution and SNR) on user-perceived quality (UPQ), and only within a limited scope. In this paper, we conduct a subjective quality assessment experiment on video sequences encoded with H.264/SVC to gain a better understanding of the correlation between video content and UPQ at all scalable layers, and of the impact of the rate-distortion method and the different scalabilities on bitrate and UPQ. Findings from this experiment will contribute to a user-centered design for the adaptive delivery of scalable video streams.