924 results for The Well-Tempered Clavier


Relevance:

100.00%

Publisher:

Abstract:

A number of security models have been proposed for RFID systems. Recent studies show that current models tend to capture only a limited set of properties. Consequently, these models are often unable to distinguish between protocols with regard to finer privacy properties. This paper proposes a privacy model that introduces previously unavailable expressions of privacy. Based on the well-studied notion of indistinguishability, the model also strives to be simpler, easier to use, and more intuitive than previous models.


In this chapter we make assumptions about the primary role of education in the lives of its beneficiaries and in society. Undoubtedly, formal education plays an important role in enhancing the likelihood of the student’s participation in future social life, including enjoyment and employment, as well as in developing the well-being of society in general. Similarly, education is often seen as a main means for the intergenerational transmission of knowledge and culture. However, as Dewey (1916) argues, in liberal societies education has the capacity to enhance democratic participation in society beyond passive participation by its members. One can argue that the achievement of the ideals of democracy demands a free and strong education system. In other words, while education can function as an instrument to integrate students into the present society, it also has the potential to become an instrument for its transformation, by means of which citizens can develop an understanding of how their society functions and a sense of agency towards its transformation. Arguably, this is what Freire (1985) meant when he talked about the role of education to “read and write” the world. A stream of progressive educators (e.g., Apple (2004), Freire (1985), Giroux (2001) and McLaren (2002)) taught us that the reading of the world that is capable of leading into writing the world is a critical reading; i.e., a reading that poses “why” questions and imagines “what else can be” (Carr & Kemmis, 1987).


The existence of the Macroscopic Fundamental Diagram (MFD), which relates network space-mean density and flow, has been demonstrated in urban networks under homogeneous traffic conditions. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation utilizing the MFD concept have been reported. The key requirement for a well-defined MFD is homogeneity of the area-wide traffic condition, which cannot be universally expected in the real world. For the practical application of the MFD concept, several researchers have identified the factors influencing network homogeneity. However, they did not explicitly take into account drivers’ behaviour under real-time information provision, which has a significant impact on the shape of the MFD. This research aims to demonstrate the impact of drivers’ route choice behaviour on network performance by employing the MFD as a measurement. A microscopic simulation is chosen as the experimental platform. By changing the ratio of en-route informed drivers to pre-trip informed drivers, as well as by taking different route choice parameters, various scenarios are simulated in order to investigate how drivers’ adaptation to traffic congestion influences network performance and the MFD shape. This study confirmed the impact of information provision on the MFD shape and highlighted the significance of route choice parameter settings as an influencing factor in MFD analysis.
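As a concrete illustration of the measurement the abstract refers to, one point of an MFD can be computed by length-weighting link-level density and flow; the link values below are hypothetical detector readings, not data from the study.

```python
def mfd_point(links):
    """Network space-mean density (veh/km) and flow (veh/h), length-weighted."""
    total_length = sum(link["length_km"] for link in links)
    density = sum(link["density_veh_km"] * link["length_km"]
                  for link in links) / total_length
    flow = sum(link["flow_veh_h"] * link["length_km"]
               for link in links) / total_length
    return density, flow

# Two illustrative network links with different lengths and loadings.
links = [
    {"length_km": 0.5, "density_veh_km": 30.0, "flow_veh_h": 900.0},
    {"length_km": 1.0, "density_veh_km": 60.0, "flow_veh_h": 1200.0},
]
k, q = mfd_point(links)   # one (density, flow) point on the MFD
```

Repeating this over successive time intervals traces out the full diagram for the network.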


For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step, but rather require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence of interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z − 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations.
The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to the Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur. In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale).
The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
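The exponential Euler step described above can be sketched in a few lines. This is a generic illustration only (a truncated Taylor series for φ and a toy linear problem), not the Krylov-based variable-stepsize algorithm the thesis develops.

```python
import numpy as np

def phi_times_b(M, b, terms=25):
    """Approximate phi(M) b, phi(z) = (e^z - 1)/z, via the Taylor series
    phi(z) = sum_{k>=0} z^k / (k+1)!  (adequate for modest ||M||)."""
    result = np.zeros_like(b, dtype=float)
    term = b.astype(float)
    fact = 1.0
    for k in range(terms):
        fact *= (k + 1)            # running (k+1)!
        result += term / fact
        term = M @ term            # next power M^{k+1} b
    return result

def eem_step(y, h, f, jac):
    """One exponential Euler step: y_{n+1} = y_n + h * phi(h*J) f(y_n)."""
    return y + h * phi_times_b(h * jac(y), f(y))

# On a linear problem y' = A y the step is exact: y(h) = e^{hA} y0.
A = np.array([[-2.0, 1.0], [0.0, -1.0]])
y0 = np.array([1.0, 1.0])          # an eigenvector of A (eigenvalue -1)
y1 = eem_step(y0, 0.1, lambda y: A @ y, lambda y: A)
```

In production codes the φ(A)b product is computed with Krylov subspace methods rather than a Taylor series, which is what makes the approach competitive for large Jacobians.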


The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students’ conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and ‘coverage’, respectively. The results allowed us to directly compare students’ abilities to characterise a computational problem, as a unit test suite, and to develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students’ unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students’ testing abilities lag well behind their coding skills.
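The tests-as-specification setup described above looks like the following minimal sketch; the `median` exercise is an invented stand-in, not one of the study's problems.

```python
import unittest

def median(xs):
    """Median of a non-empty list of numbers."""
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

class MedianSpec(unittest.TestCase):
    # Each test documents one requirement and can be written before any code exists.
    def test_odd_length_picks_middle(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length_averages_middle_pair(self):
        self.assertEqual(median([4, 1, 2, 3]), 2.5)

    def test_single_element(self):
        self.assertEqual(median([7]), 7)
```

Assessing the suite for coverage and the function for correctness, separately, mirrors the experimental design of the study.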


The well-known power system stabilizer (PSS) is used to generate supplementary control signals for the excitation system of a generator so as to damp low frequency oscillations in the power system concerned. To date, various PSS design methods have been proposed, and some of them have been applied in actual power systems to varying degrees. Against this background, small-disturbance eigenvalue analysis and large-disturbance dynamic simulations in the time domain are carried out to evaluate the performance of four different PSS designs: the Conventional PSS (CPSS), Single-Neuron PSS (SNPSS), Adaptive PSS (APSS) and Multi-band PSS (MBPSS). To make the comparisons equitable, the parameters of the four kinds of PSSs are all determined by the steepest descent method. Finally, an 8-unit 24-bus power system is employed to demonstrate the performance of the four kinds of PSSs by well-established eigenvalue analysis as well as numerous digital simulations, and some useful conclusions are drawn.
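The steepest descent tuning mentioned above can be sketched generically: minimise a scalar objective by stepping against a finite-difference gradient. The quadratic objective here is a placeholder; real PSS tuning minimises a damping-related criterion over several gain and time-constant parameters.

```python
def steepest_descent(J, k0, lr=0.1, steps=200, h=1e-6):
    """Minimise scalar objective J over one parameter k by steepest descent."""
    k = k0
    for _ in range(steps):
        grad = (J(k + h) - J(k - h)) / (2 * h)   # central finite difference
        k -= lr * grad                           # step against the gradient
    return k

# Illustrative objective with a known minimiser at k = 3.0.
k_opt = steepest_descent(lambda k: (k - 3.0) ** 2, k0=0.0)
```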


Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
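For contrast with the empirical-likelihood route, the basic ABC rejection scheme that it bypasses can be sketched as follows; the Gaussian model, uniform prior, and tolerance are all illustrative choices, not the paper's examples.

```python
import random

random.seed(1)
observed_mean = 0.0                # observed summary statistic
tolerance = 0.05                   # ABC tolerance
accepted = []
while len(accepted) < 200:
    mu = random.uniform(-2, 2)                        # draw from the prior
    sim = [random.gauss(mu, 1) for _ in range(50)]    # simulate from the model
    # Keep the draw if the simulated summary is close to the observed one.
    if abs(sum(sim) / len(sim) - observed_mean) < tolerance:
        accepted.append(mu)
post_mean = sum(accepted) / len(accepted)              # approximate posterior mean
```

The repeated model simulation inside the loop is exactly the cost that the empirical-likelihood alternative avoids.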


Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction of particulate matter concentrations in the atmosphere. To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends, and provided the rank ordering of the sites and years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis by publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked.
The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site-types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distribution and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
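The factorisation behind PMF can be illustrated in miniature. Real PMF weights each residual by its measurement uncertainty; the sketch below shows only the unweighted nonnegative analogue, using Lee and Seung multiplicative updates on synthetic data (all values are random placeholders, not measurements from the thesis).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((20, 6))            # synthetic samples-by-species data matrix
k = 2                              # assumed number of emission sources
G = rng.random((20, k)) + 0.1      # nonnegative source contributions (guess)
F = rng.random((k, 6)) + 0.1       # nonnegative source profiles (guess)
err0 = np.linalg.norm(X - G @ F)   # initial fit error

# Multiplicative updates keep G and F nonnegative while reducing ||X - GF||.
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
```

In receptor modelling, rows of F are then interpreted as chemical fingerprints of candidate sources.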


Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours.
We show that some generic transformations and concrete constructions enjoy this property and then present a new public-key encryption (PKE) scheme having this property, with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
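The asymmetry between solving and verifying a modular-exponentiation puzzle can be sketched with a repeated-squaring construction in the spirit of Rivest, Shamir and Wagner's time-lock puzzles (this is an illustration of the general idea, not the scheme proposed in the thesis; the primes and difficulty are toy values).

```python
p, q = 999983, 1000003             # toy primes; real schemes use RSA-sized moduli
N = p * q
phi_N = (p - 1) * (q - 1)          # trapdoor known only to the puzzle issuer
t = 10_000                         # difficulty: number of sequential squarings
x = 12345                          # puzzle instance

def solve(x, t, N):
    """Compute x^(2^t) mod N the slow way: t inherently sequential squarings."""
    y = x % N
    for _ in range(t):
        y = y * y % N
    return y

def fast_verify(x, t, N, phi_N):
    """Same value via the trapdoor: reduce the exponent 2^t modulo phi(N)."""
    return pow(x, pow(2, t, phi_N), N)
```

The solver does t modular multiplications while the holder of the factorisation checks the answer with a single modular exponentiation, which is the kind of cheap-verification property the thesis targets.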


There has been much discussion and controversy in the media recently regarding metal toxicity following large head metal-on-metal (MoM) total hip replacement (THR). Patients have been reported as having hugely elevated levels of metal ions with, at times, devastating systemic, neurological and/or orthopaedic sequelae. However, no direct correlation between metal ion level and severity of metallosis has yet been defined. Normative levels of metal ions in well functioning, non-Cobalt-Chrome hips have also not been defined to date. The Exeter total hip replacement contains no Cobalt-Chrome (Co-Cr) as it is made entirely from stainless steel. However, small levels of these metals may be present in the modular head of the prosthesis, and their effect on metal ion levels in the well functioning patient has not been investigated. We proposed to define the “normal” levels of metal ions detected by blood test in 20 well functioning patients at a minimum of 1 year post primary Exeter total hip replacement, where the patient had had only one joint replaced. Presently, accepted normal levels of blood Chromium are 10–100 nmol/L and of plasma Cobalt are 0–20 nmol/L. The UK Medicines and Healthcare products Regulatory Agency (MHRA) has suggested that levels of either Cobalt or Chromium above 7 ppb (equivalent to 135 nmol/L for Chromium and 120 nmol/L for Cobalt) may be significant. Below this level it is indicated that significant soft tissue reaction and tissue damage is less likely and the risk of implant failure is reduced. The hips were a mixture of cemented and hybrid procedures performed by two experienced orthopaedic consultants. Seventy percent of patients were female, with a mixture of head sizes used. In our cohort, there were no cases where the blood Chromium levels were above the normal range, and in more than 70% of cases, levels were below recordable levels. There were also no cases of elevated plasma Cobalt levels, and in 35% of cases, levels were negligible.
We conclude that implantation of an Exeter total hip replacement does not lead to elevation of blood metal ion levels.
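The quoted equivalence of the 7 ppb threshold with the nmol/L figures can be checked directly, using standard atomic masses (Cr 52.00 g/mol, Co 58.93 g/mol) and the usual approximation that 1 ppb in an aqueous sample corresponds to 1 µg/L.

```python
def ppb_to_nmol_per_litre(ppb, molar_mass_g_mol):
    """Convert a ppb mass concentration to nmol/L."""
    micrograms_per_litre = ppb                          # 1 ppb ≈ 1 µg/L
    return micrograms_per_litre / molar_mass_g_mol * 1000.0

chromium = ppb_to_nmol_per_litre(7, 52.00)   # ≈ 135 nmol/L
cobalt = ppb_to_nmol_per_litre(7, 58.93)     # ≈ 119 nmol/L (quoted as 120)
```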


Density functional theory (DFT) is a powerful approach to electronic structure calculations in extended systems, but currently suffers from inadequate incorporation of long-range dispersion, or Van der Waals (VdW), interactions. VdW-corrected DFT is tested for interactions involving molecular hydrogen, graphite, single-walled carbon nanotubes (SWCNTs), and SWCNT bundles. The energy correction, based on an empirical London dispersion term with a damping function at short range, allows a reasonable physisorption energy and equilibrium distance to be obtained for H2 on a model graphite surface. The VdW-corrected DFT calculation for an (8, 8) nanotube bundle reproduces accurately the experimental lattice constant. For H2 inside or outside an (8, 8) SWCNT, we find the binding energies are respectively higher and lower than that on a graphite surface, correctly predicting the well-known curvature effect. We conclude that the VdW correction is a very effective method for implementing DFT calculations, allowing a reliable description of both short-range chemical bonding and long-range dispersive interactions. The method will find powerful applications in areas of SWCNT research where empirical potential functions either have not been developed, or do not capture the necessary range of both dispersion and bonding interactions.
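The damped London dispersion term described above has the generic pairwise form E_disp = −f_damp(r) · C6 / r^6, with the damping function switching the correction off at short range where DFT already describes the bonding. The C6, r0 and steepness values below are illustrative placeholders, not fitted parameters from the paper.

```python
import math

def damping(r, r0=3.8, d=20.0):
    """Fermi-type damping: tends to 0 as r -> 0 and to 1 for r >> r0 (r in Å)."""
    return 1.0 / (1.0 + math.exp(-d * (r / r0 - 1.0)))

def e_disp_pair(r, c6=1.75):
    """Dispersion energy of one atom pair (arbitrary energy units)."""
    return -damping(r) * c6 / r**6
```

Summing this term over all atom pairs and adding it to the DFT total energy is the essence of the empirical VdW correction.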


Now in its ninth edition, Australian Tax Analysis: Cases, Commentary, Commercial Applications and Questions has a proven track record as a high-level work for students of taxation law written by a team of authors with many years’ experience. Taking into account the fact that the volume of material to be processed by today’s taxation student can be overwhelming, the well-chosen extracts and thought-provoking commentary in Australian Tax Analysis, 9th edition, provide readers with the depth of knowledge, and the reasoning and analytical skills, which will be required of them as practitioners. In addition to the carefully selected case extracts and the helpful commentary, each chapter is supplemented by engaging practice questions involving problem solving, commercial decision-making, legal analysis and quantitative application. All these elements combined make Australian Tax Analysis an invaluable aid to the understanding of a subject which can be both technical and complex.


More than 10 years have passed since the High Court of Australia confirmed the recoverability of damages for the cost of raising a child, in the well-known decision in Cattanach v Melchior. Yet a number of aspects of the assessment of such “wrongful birth” damages had not been the subject of a comprehensive court ruling. The recent decision in Waller v James was widely anticipated as potentially providing a comprehensive discussion of the principles relevant to the assessment of damages in wrongful birth cases. However, given a finding on causation adverse to the plaintiffs, the trial judge held that it was unnecessary to determine the quantum of damages. Justice Hislop did, however, make some comments in relation to the assessment of damages. This article focuses mostly on the argued damages issues relating to the costs of raising the child and the trial judge’s comments regarding the same. The Waller v James claim was issued before the enactment of the Health Care Liability Act 2001 (NSW) and the Civil Liability Act 2002 (NSW). Although the case was therefore decided according to the “common law”, as explained below, his Honour’s comments may be of relevance to more recent claims governed by the civil liability legislation in New South Wales, Queensland and South Australia.


Recent association studies in multiple sclerosis (MS) have identified and replicated several single nucleotide polymorphism (SNP) susceptibility loci including CLEC16A, IL2RA, IL7R, RPL5, CD58, CD40 and chromosome 12q13–14, in addition to the well established allele HLA-DR15. There is potential that these genetic susceptibility factors could also modulate MS disease severity, as demonstrated previously for the MS risk allele HLA-DR15. We investigated this hypothesis in a cohort of 1006 well-characterised MS patients from South-Eastern Australia. We tested the MS-associated SNPs for association with five measures of disease severity incorporating disability, age of onset, cognition and brain atrophy. We observed trends towards association between the RPL5 risk SNP and the time between the first demyelinating event and relapse, and between the CD40 risk SNP and symbol digit test score. No associations were significant after correction for multiple testing. We found no evidence for the hypothesis that these new MS disease risk-associated SNPs influence disease severity.


The interaction between new two-dimensional carbon allotropes, i.e. graphyne (GP) and graphdiyne (GD), and the light metal complex hydrides LiAlH4, LiBH4, and NaAlH4 was studied using density functional theory (DFT) incorporating a long-range van der Waals dispersion correction. The light metal complex hydrides show much stronger interaction with GP and GD than with fullerene due to the well-defined pore structure. Such strong interactions greatly affect the degree of charge donation from the alkali metal atom to AlH4 or BH4, consequently destabilizing the Al-H or B-H bonds. Compared to the isolated light metal complex hydride, the presence of GP or GD can lead to a significant reduction of the hydrogen removal energy. Most interestingly, the hydrogen removal energies for LiBHx on GP and GD are found to be lowered at all stages (x from 4 to 1), whereas the H-removal energy at the third stage is increased for LiBH4 on fullerene. In addition, the presence of uniformly distributed pores on GP and GD is expected to facilitate the dehydrogenation of light metal complex hydrides. The present results highlight new interesting materials for catalyzing light metal complex hydrides for potential application as media for hydrogen storage. Since GD has been successfully synthesized in a recent experiment, we hope the present work will stimulate further experimental investigations in this direction.
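The hydrogen removal energy compared throughout the abstract is conventionally computed from total energies as E_rem = E(system with one H removed) + ½ E(H2) − E(system). The sketch below encodes that bookkeeping; the numerical energies are made-up placeholders, not DFT results from the study.

```python
def h_removal_energy(e_after, e_h2, e_before):
    """Energy to remove one H, releasing half an H2 molecule (same units)."""
    return e_after + 0.5 * e_h2 - e_before

# Illustrative totals in eV (hypothetical values for demonstration only).
e = h_removal_energy(e_after=-101.2, e_h2=-6.8, e_before=-105.0)
```

A lower value of this quantity on GP or GD than for the isolated hydride is what signals the destabilisation described above.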