924 results for hex meshing schemes
Abstract:
Electrical resistivity of soils and sediments is strongly influenced by the presence of interstitial water. Taking advantage of this dependency, electrical-resistivity imaging (ERI) can be effectively utilized to estimate subsurface soil-moisture distributions. The ability to obtain spatially extensive data combined with time-lapse measurements provides further opportunities to understand links between land use and climate processes. In natural settings, spatial and temporal changes in temperature and porewater salinity influence the relationship between soil moisture and electrical resistivity. Apart from environmental factors, technical, theoretical, and methodological ambiguities may also interfere with accurate estimation of soil moisture from ERI data. We have examined several of these complicating factors using data from a two-year study at a forest-grassland ecotone, a boundary between neighboring but different plant communities. At this site, temperature variability accounts for approximately 20-45% of resistivity changes from cold winter to warm summer months. Temporal changes in groundwater conductivity (mean = 650 μS/cm, σ = 57.7 μS/cm) and a roughly 100-μS/cm spatial difference between the forest and grassland had only a minor influence on the moisture estimates. Significant seasonal fluctuations in temperature and precipitation had negligible influence on the basic measurement errors in data sets. Extracting accurate temporal changes from ERI can be hindered by nonuniqueness of the inversion process and uncertainties related to time-lapse inversion schemes. The accuracy of soil moisture obtained from ERI depends on all of these factors, in addition to empirical parameters that define the petrophysical soil-moisture/resistivity relationship. Many of the complicating factors and modifying variables that must be addressed to accurately quantify soil-moisture changes with ERI can be accounted for using field data and theoretical principles.
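The two corrections implied above — normalizing field resistivity to a reference temperature, then converting corrected resistivity to moisture through a petrophysical relation — can be sketched as follows. This is a hedged illustration, not the authors' workflow: the linear ratio model with c ≈ 0.02/°C and the Archie parameters a, m, n are assumed placeholder values that would be calibrated per site.

```python
def resistivity_to_25c(rho_t, temp_c, c=0.02):
    """Normalize a resistivity measured at temp_c (deg C) to 25 C,
    assuming the common linear model sigma_T = sigma_25 * (1 + c*(T - 25))."""
    return rho_t * (1.0 + c * (temp_c - 25.0))

def moisture_from_resistivity(rho, rho_w, phi, a=1.0, m=2.0, n=2.0):
    """Invert Archie's law, rho = a * rho_w * phi**-m * S**-n, for the
    moisture saturation S; rho_w is porewater resistivity, phi is porosity."""
    return (a * rho_w / (rho * phi ** m)) ** (1.0 / n)

# Example: 100 ohm-m measured at 35 C normalizes to 120 ohm-m at 25 C
rho_25 = resistivity_to_25c(100.0, 35.0)
```

With c = 0.02/°C, a seasonal swing of 10-20 °C alone changes apparent resistivity by roughly 20-40%, which is why the temperature correction must precede any moisture estimate.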
Abstract:
Background Pharmacist prescribing has been introduced in several countries and is a possible future role for pharmacy in Australia. Objective To assess whether patient satisfaction with the pharmacist as a prescriber, and patient experiences in two settings of collaborative doctor-pharmacist prescribing, may be barriers to implementation of pharmacist prescribing. Design Surveys containing closed questions with Likert-scale responses were completed in both settings to investigate patient satisfaction after each consultation. A further survey investigating attitudes towards pharmacist prescribing, after multiple consultations, was completed in the sexual health clinic. Setting and Participants A surgical pre-admission clinic (PAC) in a tertiary hospital and an outpatient sexual health clinic at a university hospital. Two hundred patients scheduled for elective surgery, and 17 patients diagnosed with HIV infection, respectively, were recruited to the pharmacist prescribing arm of two collaborative doctor-pharmacist prescribing studies. Results Consultation satisfaction response rates in PAC and the sexual health clinic were 182/200 (91%) and 29/34 (85%), respectively. In the sexual health clinic, the response rate for the attitudes towards pharmacist prescribing survey was 14/17 (82%). Consultation satisfaction was high in both studies: most patients (98% and 97%, respectively) agreed they were satisfied with the consultation. In the sexual health clinic, all patients (14/14) agreed that they trusted the pharmacist’s ability to prescribe, that care was as good as usual care, and that they would recommend seeing a pharmacist prescriber to friends. Discussion and Conclusion Most patients reported high satisfaction with pharmacist prescriber consultations, and a positive outlook on the collaborative model of care in the sexual health clinic.
Abstract:
Enrichment of marine organics in remote marine aerosols can influence their ability to act as cloud condensation nuclei (CCN), which are sites for water vapour to condense into cloud droplets. This project identified the composition and hygroscopicity of sea spray aerosol (SSA) formed at the ocean surface due to bursting of entrained air bubbles. SSA from organically enriched waters in the southwest Pacific and Southern Oceans was investigated. Results indicate that current emission schemes may not adequately predict SSA CCN, influencing the representation of cloud formation in climate modelling.
Abstract:
Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message to a fixed-length output referred to as the hash value. As a security requirement, a hash value should not serve as an image for two distinct input messages, and it should be difficult to find the input message from a given hash value. Secure hash functions serve data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), serve data integrity and data origin authentication in the secret key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
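The fixed-length-output and keyed-MAC properties described here can be demonstrated with Python's standard `hashlib` and `hmac` modules (using SHA-256 rather than the collision-broken MD5/SHA-1):

```python
import hashlib
import hmac

msg_short = b"short message"
msg_long = b"a much longer message " * 1000

# Arbitrary-length input, fixed-length (256-bit) hash value
d1 = hashlib.sha256(msg_short).hexdigest()
d2 = hashlib.sha256(msg_long).hexdigest()

# Keyed hash (MAC): data integrity plus data-origin authentication
key = b"shared secret key"
tag = hmac.new(key, msg_short, hashlib.sha256).hexdigest()
```

Both digests are 64 hex characters (256 bits) regardless of input length, and verifying `tag` requires knowledge of `key` — the secret-key setting mentioned above.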
Abstract:
In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect power quality. A scheme considering different DSTATCOM algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and facilitate unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to obtain a comprehensive picture of a real-life distribution network. Combined operation of a static compensator and a voltage regulator is tested for optimum quality and stability of the system.
Abstract:
The preparation of macroporous methacrylate monolithic material with controlled pore structures can be carried out in an unstirred mould through careful and precise control of the polymerisation kinetics and parameters. Contemporary synthesis conditions of methacrylate monolithic polymers are based on existing polymerisation schemes without an in-depth understanding of the dynamics of pore structure and formation. This leads to poor performance in polymer usage, thereby affecting final product recovery and purity, retention time, productivity and process economics. The unique porosity of methacrylate monolithic polymer, which propels its usage in many industrial applications, can be controlled easily during its preparation. Control of the kinetics of the overall process through changes in reaction time, temperature and overall composition, such as cross-linker and initiator contents, allows the fine-tuning of the macroporous structure and provides an understanding of the mechanism of pore formation within the unstirred mould. The significant effect of temperature on the reaction kinetics serves as an effective means to control and optimise the pore structure, and allows the preparation of polymers with different pore size distributions from the same composition of the polymerisation mixture. Increasing the concentration of the cross-linking monomer affects the composition of the final monoliths and also decreases the average pore size as a result of premature formation of highly cross-linked globules with a reduced propensity to coalesce. The choice and concentration of porogen solvent is also imperative. Different porogens and porogen mixtures produce different pore structures. For example, larger pores are obtained in a poor solvent due to early phase separation.
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to the second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA standardized a variant of the RMX hash function mode and published this standard in the Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, such as the Davies-Meyer construction used in popular hash functions such as MD5 designed by Rivest and the SHA family of hash functions designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with ‘built-in’ randomization.
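The core idea of the RMX mode — prepend a fresh salt r and XOR r into every message block before hashing, so that each signature hashes a freshly randomized message — can be sketched roughly as follows. This is a simplified illustration, not the exact RMX or SP 800-106 encoding (which differ in padding and salt-handling details):

```python
import hashlib
import os

BLOCK = 64  # bytes; SHA-256 message-block size

def rmx_style_hash(message: bytes, salt: bytes) -> str:
    """Sketch of randomize-then-hash: H(salt || (M1 xor salt) || ...)."""
    assert len(salt) == BLOCK
    padded = message + b"\x00" * (-len(message) % BLOCK)
    mixed = b"".join(
        bytes(a ^ b for a, b in zip(padded[i:i + BLOCK], salt))
        for i in range(0, len(padded), BLOCK)
    )
    return hashlib.sha256(salt + mixed).hexdigest()

salt = os.urandom(BLOCK)  # fresh salt per signature
digest_to_sign = rmx_style_hash(b"message to be signed", salt)
```

An attacker who precomputes a collision for the plain hash gains nothing, because the signer's salt is unknown when the colliding pair is prepared; forging instead requires a second-preimage-like task, as stated above.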
Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
Abstract:
We analyse the security of iterated hash functions that compute an input dependent checksum which is processed as part of the hash computation. We show that a large class of such schemes, including those using non-linear or even one-way checksum functions, is not secure against the second preimage attack of Kelsey and Schneier, the herding attack of Kelsey and Kohno and the multicollision attack of Joux. Our attacks also apply to a large class of cascaded hash functions. Our second preimage attacks on the cascaded hash functions improve the results of Joux presented at Crypto’04. We also apply our attacks to the MD2 and GOST hash functions. Our second preimage attacks on the MD2 and GOST hash functions improve the previous best known short-cut second preimage attacks on these hash functions by factors of at least 2^26 and 2^54, respectively. Our herding and multicollision attacks on the hash functions based on generic checksum functions (e.g., one-way) are a special case of the attacks on the cascaded iterated hash functions previously analysed by Dunkelman and Preneel and are not better than their attacks. On hash functions with easily invertible checksums, our multicollision and herding attacks (if the hash value is short as in MD2) are more efficient than those of Dunkelman and Preneel.
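Joux's multicollision observation, which the attacks above build on, is easy to reproduce on a toy iterated hash: k independent single-block collisions chained together yield 2^k colliding messages. A minimal sketch with a deliberately weak, hypothetical 8-bit compression function (for demonstration only):

```python
import itertools

def comp(h, m):
    """Toy 8-bit compression function -- trivially weak on purpose."""
    return (h * 131 + m * 31 + 7) % 256

def find_block_collision(h):
    """Brute-force two distinct blocks with comp(h, m) == comp(h, m')."""
    seen = {}
    for m in range(512):
        v = comp(h, m)
        if v in seen:
            return seen[v], m, v
        seen[v] = m

def toy_hash(blocks, h0=0):
    h = h0
    for m in blocks:
        h = comp(h, m)
    return h

# Chain 3 collision stages: 2**3 = 8 messages, all with the same hash
h, pairs = 0, []
for _ in range(3):
    a, b, h = find_block_collision(h)
    pairs.append((a, b))

msgs = [list(choice) for choice in itertools.product(*pairs)]
digests = {toy_hash(m) for m in msgs}
```

For an n-bit iterated hash, 2^k-way multicollisions cost only about k·2^(n/2) work, which is what defeats checksum and cascade constructions that were hoped to be stronger than a single hash.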
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge an RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task which is related to finding second preimages for the hash function. In this article, we will show how to use Dean’s method of finding expandable messages for finding a second preimage in the Merkle-Damgård hash function to existentially forge a signature scheme based on a t-bit RMX-hash function which uses the Davies-Meyer compression functions (e.g., MD4, MD5, SHA family) in 2^(t/2) chosen messages plus 2^(t/2) + 1 off-line operations of the compression function and a similar amount of memory. This forgery attack also works on the signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
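The fixed-point property of Davies-Meyer that the attack exploits is simple to see: since f(h, m) = E_m(h) XOR h, choosing h* = E_m^{-1}(0) gives f(h*, m) = h*. A toy demonstration with a hypothetical (and deliberately insecure) 64-bit block cipher:

```python
MASK = (1 << 64) - 1

def rotl(x, r):
    return ((x << r) | (x >> (64 - r))) & MASK

def rotr(x, r):
    return ((x >> r) | (x << (64 - r))) & MASK

def enc(k, x):
    """Toy invertible 'block cipher' -- NOT secure, illustration only."""
    return rotl(x ^ k, 7) ^ k

def dec(k, y):
    return rotr(y ^ k, 7) ^ k

def davies_meyer(h, m):
    """Davies-Meyer compression: f(h, m) = E_m(h) xor h."""
    return enc(m, h) ^ h

# Fixed point for any chosen message block m: h* = E_m^{-1}(0)
m = 0xDEADBEEFCAFEF00D
h_star = dec(m, 0)
```

Because `davies_meyer(h_star, m) == h_star`, the block m can be repeated any number of times without changing the chaining value — exactly the expandable-message ingredient of Dean's method mentioned above.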
Abstract:
The forthcoming NIST’s Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction supporting the FIPS 198 HMAC application. As part of its evaluation, NIST is aiming to select either a candidate hash function which is more resistant to known side channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode which is more resistant to known SCA than the other submitted alternatives. In response to this, we perform differential power analysis (DPA) on the possible smart card implementations of some of the recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and on NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against the DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash have similar security to NMAC against the DPA attacks. Our DPA attacks do not work on the NMAC setting of MDC-2, Grindahl and MAME compression functions.
Abstract:
Throughout the world, there is increasing pressure on governments, companies, regulators and standard-setters to respond to the global challenge of climate change. The growing number of regulatory requirements for organisations to disclose their greenhouse gas (GHG) emissions, and emergent national, regional and international emissions trading schemes (ETSs), reflect key government responses to this challenge. Assurance of GHG emissions disclosures enhances the credibility of these disclosures and any associated trading schemes. The auditing and assurance profession has an important role to play in the provision of such assurance, highlighted by the International Auditing and Assurance Standards Board’s (IAASB) decision to develop an international GHG emissions assurance standard. This article sets out the developments to date on an international standard for the assurance of GHG emissions disclosures. It then provides information on the way Australian companies have responded to the challenge of GHG reporting and assurance. Finally, it outlines the types of assurance that assurance providers in Australia are currently providing in this area.
Abstract:
The Office of Urban Management recognises that the values which characterise the SEQ region as 'subtropical' are important determinants of form in urban and regional planning. Subtropical values are those qualities on which our regional identity depends. A built environment which responds positively to these values is a critical ingredient for achieving a desirable future for the region. The Centre for Subtropical Design has undertaken this study to identify the particular set of values which characterises SEQ, and to translate these values into design principles that will maintain and reinforce the value set. The principles not only apply to the overall balance between the natural environment and the built environment, but can be applied by local government authorities to guide local planning schemes and help shape specific built form outcomes.
Abstract:
Worldwide public concern over climate change and the need to limit greenhouse gas (hereafter, GHG) emissions has increasingly motivated public officials to consider more stringent environmental regulation and standards. The authors argue that the development of a new international assurance standard on GHG disclosures is an appropriate response by the auditing and assurance profession to meet these challenges. At its December 2007 meeting, the International Auditing and Assurance Standards Board (hereafter, IAASB) approved a project to consider the development of such a standard aimed at promoting trust and confidence in disclosures of GHG emissions, including disclosures required under emissions trading schemes. The authors assess the types of disclosures that can be assured, and outline the issues involved in developing an international assurance standard on GHG emissions disclosures. The discussion synthesizes the insights gained from four international roundtables on the proposed IAASB assurance standard held in Asia-Pacific, North America, and Europe during 2008, and an IAASB meeting addressing this topic in December 2008.
Abstract:
This paper aims to develop a meshless approach based on the Point Interpolation Method (PIM) for numerical simulation of a space fractional diffusion equation. Two fully-discrete schemes for the one-dimensional space fractional diffusion equation are obtained by using the PIM and the strong forms of the space fractional diffusion equation. Numerical examples with different nodal distributions are studied to validate and investigate the accuracy and efficiency of the newly developed meshless approach.
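The paper's PIM discretization is not reproduced here, but the kind of fully-discrete scheme such methods are benchmarked against can be sketched with an implicit finite-difference solver using the shifted Grünwald-Letnikov approximation of the space-fractional derivative. All parameters below (order alpha = 1.8, domain, initial pulse) are illustrative assumptions:

```python
import numpy as np

def grunwald_weights(alpha, n):
    """GL weights: g_0 = 1, g_k = g_{k-1} * (k - 1 - alpha) / k."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def solve_fractional_diffusion(alpha=1.8, d=1.0, L=1.0, T=0.1, nx=50, nt=100):
    """Implicit Euler for u_t = d * d^alpha(u)/dx^alpha on (0, L),
    zero Dirichlet boundaries, shifted Grunwald formula in space."""
    h, dt = L / nx, T / nt
    x = np.linspace(0.0, L, nx + 1)
    u = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)  # square initial pulse
    g = grunwald_weights(alpha, nx + 1)
    r = d * dt / h ** alpha
    m = nx - 1                      # interior unknowns u_1 .. u_{nx-1}
    A = np.eye(m)
    for i in range(m):              # row for node i+1
        for k in range(i + 2):      # shifted sum hits nodes (i+1) - k + 1
            j = i - k + 1           # interior column index
            if 0 <= j < m:
                A[i, j] -= r * g[k]
    for _ in range(nt):
        u[1:-1] = np.linalg.solve(A, u[1:-1])
    return x, u

x, u = solve_fractional_diffusion()
```

For alpha = 2 the weights collapse to the classical (1, -2, 1) diffusion stencil, which gives a quick sanity check on the implementation; using the implicit step keeps the scheme stable without a time-step restriction.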
Abstract:
Long-term systematic population monitoring data sets are rare but are essential in identifying changes in species abundance. In contrast, community groups and natural history organizations have collected many species lists. These represent a large, untapped source of information on changes in abundance but are generally considered of little value. The major problem with using species lists to detect population changes is that the amount of effort used to obtain the list is often uncontrolled and usually unknown. It has been suggested that the number of species on a list, the "list length," can be used as a measure of effort. This paper significantly extends the utility of Franklin's approach using Bayesian logistic regression. We demonstrate the value of List Length Analysis to model changes in species prevalence (i.e., the proportion of lists on which the species occurs) using bird lists collected by a local bird club over 40 years around Brisbane, southeast Queensland, Australia. We estimate the magnitude and certainty of change for 269 bird species and calculate the probabilities that there have been declines and increases of given magnitudes. List Length Analysis confirmed suspected species declines and increases. This method is an important complement to systematically designed intensive monitoring schemes and provides a means of utilizing data that may otherwise be deemed useless. The results of List Length Analysis can be used for targeting species of conservation concern for listing purposes or for more intensive monitoring. While Bayesian methods are not essential for List Length Analysis, they can offer more flexibility in interrogating the data and are able to provide a range of parameters that are easy to interpret and can facilitate conservation listing and prioritization. © 2010 by the Ecological Society of America.
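At its core, List Length Analysis is a logistic regression of detection (species on the list or not) on list length and time. As a hedged, non-Bayesian sketch — the paper uses Bayesian logistic regression; here a plain maximum-likelihood fit on simulated lists, with every data-generating number invented for illustration:

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (y - p))
    return beta

rng = np.random.default_rng(0)
n = 2000                                    # simulated bird lists
year = rng.uniform(0, 40, n)                # 40 years of lists
list_len = rng.integers(5, 120, n)          # species per list (effort proxy)
true_logit = -1.0 + 0.8 * np.log(list_len) - 0.05 * year  # declining species
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), np.log(list_len), year])
beta = fit_logistic(X, y)
# beta[2] estimates the per-year trend in prevalence on the logit scale
```

A clearly negative `beta[2]` flags a probable decline even though per-list effort was never recorded; the Bayesian version adds full posterior probabilities of declines of given magnitudes, as described above.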