956 results for pacs: security aspects of it


Relevance: 100.00%

Abstract:

Fouling of industrial surfaces by silica and calcium oxalate can be detrimental to a number of process streams. Solution chemistry plays a large role in the rate and type of scale formed on industrial surfaces. This study examines the kinetics and thermodynamics of SiO2 and calcium oxalate composite formation in solutions containing Mg2+ ions, trans-aconitic acid and sucrose, to mimic factory sugar cane juices. The induction time (t_i) of silicic acid polymerization is found to depend on the sucrose concentration and the SiO2 supersaturation ratio (SS). Generalized kinetic and solubility models are developed for SiO2 and calcium oxalate in binary systems using response surface methodology. The role of sucrose, Mg, trans-aconitic acid, a mixture of Mg and trans-aconitic acid, the SiO2 SS ratio and Ca in the formation of composites is explained using the solution properties of these species, including their ability to form complexes.
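A minimal sketch of the kind of response surface model described above, fitting a second-order surface for induction time as a function of sucrose concentration and supersaturation ratio. The design points, variable names and measured values below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical design points: sucrose concentration (g/L) and SiO2 supersaturation ratio (SS).
sucrose = np.array([0.0, 50.0, 100.0, 0.0, 50.0, 100.0, 0.0, 50.0, 100.0])
ss      = np.array([2.0, 2.0, 2.0, 3.0, 3.0, 3.0, 4.0, 4.0, 4.0])
# Hypothetical measured induction times t_i (minutes), for illustration only.
t_i     = np.array([120.0, 150.0, 190.0, 60.0, 80.0, 110.0, 30.0, 45.0, 70.0])

# Second-order response surface: t_i ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([
    np.ones_like(sucrose), sucrose, ss,
    sucrose**2, ss**2, sucrose * ss,
])
coeffs, *_ = np.linalg.lstsq(X, t_i, rcond=None)

def predict_induction_time(sucrose_conc, ss_ratio):
    """Evaluate the fitted quadratic surface at a given condition."""
    x = np.array([1.0, sucrose_conc, ss_ratio,
                  sucrose_conc**2, ss_ratio**2, sucrose_conc * ss_ratio])
    return float(x @ coeffs)

print(predict_induction_time(75.0, 2.5))
```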

Relevance: 100.00%

Abstract:

Rakaposhi is a synchronous stream cipher that uses three main components: a non-linear feedback shift register (NLFSR), a dynamic linear feedback shift register (DLFSR) and a non-linear filtering function (NLF). The NLFSR consists of 128 bits and is initialised by the secret key K. The DLFSR holds 192 bits and is initialised by an initial vector (IV). The NLF takes 8-bit inputs and returns a single output bit. This work identifies weaknesses and properties of the cipher. The main observation is that the initialisation procedure has the so-called sliding property, which can be used to launch distinguishing and key recovery attacks. The distinguisher needs four observations of related (K, IV) pairs. The key recovery algorithm discovers the secret key K after observing 2^9 pairs of (K, IV). Based on the proposed related-key attack, the number of related (K, IV) pairs is 2^((128+192)/4). Further, the cipher is studied when the registers enter short cycles. When the NLFSR is set to all ones, the cipher degenerates to a linear feedback shift register with a non-linear filter; consequently, the initial state (and the secret key and IV) can be recovered with complexity 2^63.87. If the DLFSR is set to all zeros, the NLF reduces to a low-nonlinearity filter function. As a result, the cipher is insecure, allowing the adversary to distinguish it from a random cipher after 2^17 observations of keystream bits. There is also a key recovery algorithm that finds the secret key with complexity 2^54.
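The register sizes quoted above suggest the general shape of such a design: two shift registers clocked together, with a filter producing one keystream bit per step. The toy sketch below only illustrates that generic structure; the feedback functions, tap positions and filter are placeholders and are not the Rakaposhi specification:

```python
# Toy keystream generator with a 128-bit nonlinear FSR, a 192-bit (dynamically
# reconfigurable) LFSR and a filter producing one keystream bit per clock.
# All feedback/filter functions are placeholders, NOT the real Rakaposhi ones.

def toy_keystream(key_bits, iv_bits, n_bits):
    nlfsr = list(key_bits)          # 128-bit register, initialised from the key
    dlfsr = list(iv_bits)           # 192-bit register, initialised from the IV
    assert len(nlfsr) == 128 and len(dlfsr) == 192

    out = []
    for _ in range(n_bits):
        # Placeholder nonlinear feedback for the NLFSR.
        fb_n = nlfsr[0] ^ nlfsr[31] ^ (nlfsr[60] & nlfsr[90])
        # Placeholder linear feedback for the DLFSR; a real DLFSR would switch
        # its feedback polynomial depending on control bits taken from the NLFSR.
        fb_d = dlfsr[0] ^ dlfsr[47] ^ dlfsr[191]
        # Placeholder 8-to-1 filter over taps from both registers.
        t = [nlfsr[3], nlfsr[17], nlfsr[80], nlfsr[127],
             dlfsr[5], dlfsr[64], dlfsr[100], dlfsr[150]]
        z = (t[0] & t[1]) ^ (t[2] & t[3]) ^ t[4] ^ t[5] ^ t[6] ^ t[7]
        out.append(z)
        nlfsr = nlfsr[1:] + [fb_n]
        dlfsr = dlfsr[1:] + [fb_d]
    return out

print(toy_keystream([0] * 127 + [1], [1] + [0] * 191, 8))
```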

Relevance: 100.00%

Abstract:

This paper presents a formal security analysis of the current Australian e-passport implementation using the model checking tools CASPER/CSP/FDR. We highlight security issues in the current implementation and identify new threats when an e-passport system is integrated with an automated processing system such as SmartGate. The paper also provides a security analysis of the European Union (EU) proposal for Extended Access Control (EAC), which is intended to provide improved security in protecting the biometric information of the e-passport bearer. The current e-passport specification fails to provide a list of adequate security goals that could be used for security evaluation. We fill this gap by presenting a collection of security goals for the evaluation of e-passport protocols. Our analysis confirms previously identified security weaknesses and shows that both the Australian e-passport implementation and the EU proposal fail to address many security and privacy aspects that are paramount in implementing a secure border control mechanism. ACM Classification: C.2.2 (Communication/Networking and Information Technology – Network Protocols – Model Checking), D.2.4 (Software Engineering – Software/Program Verification – Formal Methods), D.4.6 (Operating Systems – Security and Privacy Protection – Authentication)

Relevance: 100.00%

Abstract:

Organisations tend to adopt new technologies to improve their functions, reduce costs and attain best practices. Technology promoters (or vendors) work along those lines to convince adopters to invest in these technologies and, in return, to generate profit for their own organisations. The resulting potential 'conflicts of interest' make the study of the reasons behind IT diffusion and adoption an interesting subject. In this paper we look at IT diffusion and adoption in terms of technology (system features), organisational aspects (firm-level characteristics) and inter-organisational aspects (market dynamics) in order to see who the real beneficiaries of technology adoption might be. We use ERP packages as an example of an innovation that has been widely diffused and adopted over the last 10 years. We believe our findings can be useful to those adopting ERP packages, as they give them a wider view of the situation.

Relevance: 100.00%

Abstract:

Occupational standards concerning the allowable concentrations of chemical compounds in the ambient air of workplaces have been established at national level in several countries. With the integration of the European Union, there is a need to establish harmonised Occupational Exposure Limits. For analytical developments, it is apparent that methods for speciation or fractionation of carcinogenic metal compounds will be of increasing practical importance for standard setting. Criteria such as applicability under field conditions, cost-effectiveness and robustness are practical driving forces for new developments. When the European Union issued a list of 62 chemical substances with Occupational Exposure Limits in 2000, 25 substances received a 'skin' notation, indicating that toxicologically significant amounts may be taken up via the skin. Similar notations exist at national level. For such substances, monitoring concentrations in ambient air will not be sufficient; biological monitoring strategies will gain further importance in the medical surveillance of workers exposed to such compounds. Progress in establishing legal frameworks for biological monitoring of chemical exposures within Europe is paralleled by scientific advances in this field. A new aspect is the possibility of differential adduct monitoring, using blood proteins of different half-life or lifespan. This technique allows differentiation between long-term mean exposure to reactive chemicals and short-term episodes, for example accidental overexposure. For further analytical developments, the following issues have been identified as particularly important: new dose monitoring strategies, sensitive and reliable methods for the detection of DNA adducts, cytogenetic parameters in biological monitoring, methods to monitor exposure to sensitizing chemicals, and parameters for individual susceptibility to chemical toxicants.

Relevance: 100.00%

Abstract:

Infectious diseases such as SARS, influenza and bird flu have the potential to cause global pandemics; a key intervention will be vaccination. Hence, it is imperative to have in place the capacity to create vaccines against new diseases in the shortest time possible. In 2004, the Institute of Medicine asserted that the world was tottering on the verge of a colossal influenza outbreak, and stated that the inadequate production system for influenza vaccines is a major obstacle to preparedness for influenza outbreaks. Because of production issues, the vaccine industry is facing financial and technological bottlenecks: in October 2004, the FDA was caught off guard by a shortage of flu vaccine, caused by contamination at a US-based plant (Chiron Corporation), one of only two suppliers of US flu vaccine. Owing to difficulties in production and long processing times, the bulk of the world's vaccine production comes from a very small number of companies compared to the number of companies producing drugs. Conventional vaccines are made of attenuated or modified forms of viruses. Relatively high and continuous doses are administered when a non-viable vaccine is used, and the overall protective immunity obtained is short-lived. Safety concerns about viral vaccines have propelled interest in creating a viable replacement that would be more effective and safer to use.

Relevance: 100.00%

Abstract:

At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes, named the RMX hash function mode, was recommended for practical purposes. The US National Institute of Standards and Technology (NIST) standardized a variant of the RMX hash function mode and published it in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge signatures in the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes, and then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family, designed by the National Security Agency (NSA) and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
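The fixed-point property exploited above is easy to state concretely. In the Davies-Meyer construction h' = E_m(h) XOR h (where E is a block cipher keyed by the message block m), choosing h = E_m^{-1}(0) gives E_m(h) XOR h = h, a fixed point for any m. A minimal sketch with a toy 32-bit "cipher" (illustrative only, not MD5/SHA):

```python
# Davies-Meyer compression: h' = E(m, h) XOR h.
# A fixed point is obtained by setting h = D(m, 0), since then E(m, h) = 0
# and h' = 0 XOR h = h.  E below is a toy keyed 32-bit permutation.

MASK = 0xFFFFFFFF

def _rot(m):
    return ((m << 7) | (m >> 25)) & MASK

def E(m, h):
    """Toy invertible 'block cipher' keyed by m (illustrative only)."""
    return ((h + m) & MASK) ^ _rot(m)

def D(m, c):
    """Inverse of E under the same key m."""
    return ((c ^ _rot(m)) - m) & MASK

def davies_meyer(m, h):
    return E(m, h) ^ h

m = 0xCAFEBABE
h_fixed = D(m, 0)                            # choose h so that E_m(h) = 0
assert E(m, h_fixed) == 0
assert davies_meyer(m, h_fixed) == h_fixed   # fixed point: compressing m leaves h unchanged
print(hex(h_fixed))
```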

Relevance: 100.00%

Abstract:

Protection of the passwords used to authenticate computer systems and networks is one of the most important applications of cryptographic hash functions. Because precomputed memory look-up attacks, such as birthday and dictionary attacks, can be applied to password hash values to recover passwords, it is usually recommended to hash the combination of a salt and the password, denoted salt||password, to prevent these attacks. In this paper, we present the first security analysis of the salt||password hashing application. We show that when hash functions based on compression functions with easily found fixed points are used to compute salt||password hashes, these hashes are susceptible to precomputed offline birthday attacks. For example, the attack applies to salt||password hashes computed using standard hash functions such as MD5, SHA-1, SHA-256 and SHA-512, which are based on the popular Davies-Meyer compression function. The attack exposes a subtle property of this application: although the provision of a salt prevents an attacker from finding passwords, salts prefixed to passwords do not prevent an attacker from mounting a precomputed birthday attack to forge an unknown password. In this forgery attack, we demonstrate the possibility of constructing multiple passwords for an unknown password with the same hash value and salt. Interestingly, password||salt hashes (i.e. salts suffixed to passwords) computed using Davies-Meyer hash functions are not susceptible to this attack, showing the first security gap between the prefix-salt and suffix-salt methods of hashing passwords.
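A minimal sketch of the two hashing orders compared above (prefix salt vs. suffix salt), using Python's standard hashlib; the hash function, salt length and password are illustrative choices only:

```python
import hashlib
import os

def hash_salt_then_password(salt: bytes, password: bytes) -> str:
    """Prefix-salt scheme analysed in the paper: H(salt || password)."""
    return hashlib.sha256(salt + password).hexdigest()

def hash_password_then_salt(salt: bytes, password: bytes) -> str:
    """Suffix-salt scheme: H(password || salt)."""
    return hashlib.sha256(password + salt).hexdigest()

salt = os.urandom(16)                      # per-user random salt
pw = b"correct horse battery staple"       # illustrative password

print(hash_salt_then_password(salt, pw))
print(hash_password_then_salt(salt, pw))
```

In practice a dedicated password hashing scheme (bcrypt, scrypt, Argon2) would normally be used rather than a single unkeyed hash; the sketch only fixes the salt||password versus password||salt notation used in the abstract.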

Relevance: 100.00%

Abstract:

Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers' peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers' location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs), such as tariffs, price and managed supply, in a conceptual 'map' of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand under different market-based and government interventions in various customer locations of interest, and to investigate the relative importance of instituting programs that build trust and knowledge through well-designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tick-box interface. The model combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were Off-Peak Tariffs, Managed Supply and increases in the price of electricity. The impact on peak demand reduction differed for each location and highlighted that household numbers and demographics, as well as the different climates, were significant factors. The model presented possible network peak demand reductions that would delay network upgrades, resulting in savings for Queensland utilities and ultimately for households. This systems approach, using Bayesian networks to assist the management of peak demand in the different modelled locations in Queensland, provided insights about the most important elements in the system and about intervention strategies that could be tailored to the targeted customer segments.
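A minimal, illustrative sketch of the kind of quantified model described: a tiny discrete Bayesian-network-style calculation relating a tariff intervention and climate to household peak demand. The structure, states and probabilities below are invented for illustration and are not the study's model:

```python
import itertools

# Toy network: Tariff -> PeakDemand <- Climate
# P(Tariff), P(Climate) and P(PeakDemand = "high" | Tariff, Climate) are made-up numbers.
p_tariff = {"flat": 0.7, "off_peak": 0.3}
p_climate = {"mild": 0.6, "hot": 0.4}
p_high_given = {
    ("flat", "mild"): 0.40, ("flat", "hot"): 0.70,
    ("off_peak", "mild"): 0.25, ("off_peak", "hot"): 0.45,
}

def p_high_peak(tariff_dist):
    """Marginal probability of high peak demand for a given tariff mix."""
    total = 0.0
    for t, c in itertools.product(p_tariff, p_climate):
        total += tariff_dist[t] * p_climate[c] * p_high_given[(t, c)]
    return total

baseline = p_high_peak(p_tariff)
# Intervention: shift most customers onto an off-peak tariff (a CMO in the abstract's terms).
intervention = p_high_peak({"flat": 0.2, "off_peak": 0.8})
print(f"P(high peak) baseline:     {baseline:.3f}")
print(f"P(high peak) intervention: {intervention:.3f}")
```

A full model of the kind described would have many more nodes (demographics, trust, knowledge, climate zones) and would typically be built in a dedicated Bayesian network tool, but the marginalisation step is the same.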

Relevance: 100.00%

Abstract:

While twin studies have previously demonstrated high heritability of susceptibility to ankylosing spondylitis (AS), it is only recently that the involvement of genetic factors in determining the severity of the disease has been demonstrated. The genes involved in determining the rate of ankylosis in AS are likely to be different from those involved in the underlying immunologic events, and represent important potential targets for treatment of AS. This article will describe the progress that has been made in the genetic epidemiology of AS, and in identifying the genes involved.

Relevance: 100.00%

Abstract:

Streptococcus pyogenes (group A streptococcus) is an important human pathogen, causing a wide array of infections ranging in severity. The majority of S. pyogenes infections are mild upper respiratory tract or skin infections. Severe, invasive infections, such as bacteraemia, are relatively rare, but constitute a major global burden with a high mortality. Certain streptococcal types are associated with more severe disease and higher mortality. Bacterial, non-necrotizing cellulitis and erysipelas are localised infections of the skin, and although they are usually not life-threatening, they have a tendency to recur and therefore cause substantial morbidity. Despite several efforts aimed at developing an effective and safe vaccine against S. pyogenes infections, no vaccine is yet available. In this study, the epidemiology of invasive S. pyogenes infections in Finland was described over a decade of national, population-based surveillance. Recent trends in incidence, outcome and bacterial types were investigated. The beta-haemolytic streptococci causing cellulitis and erysipelas infections in Finland were studied in a case-control study. Bacterial isolates were characterised using both conventional and molecular typing methods, such as emm typing, which is the most widely used typing method for beta-haemolytic streptococci. The incidence of invasive S. pyogenes disease has shown an increasing trend over the past ten years in Finland, especially from 2006 onwards. Age- and sex-specific differences in the incidence rate were identified, with men having a higher incidence than women, especially among persons aged 45-64 years. In contrast, more infections occurred in women aged 25-34 years than in men. Seasonal patterns with occasional peaks during midsummer and midwinter were observed. Differences in the predisposing factors and underlying conditions of patients may contribute to these differences. Case fatality associated with invasive S. pyogenes infections peaked in 2005 (12%) but remained at a reasonably low level (8% overall during 2004-2007) compared to that of other developed countries (mostly exceeding 10%). Changes in the prevalent emm types were associated with the observed increases in incidence and case fatality. In the case-control study, acute bacterial non-necrotizing cellulitis was caused predominantly by Streptococcus dysgalactiae subsp. equisimilis rather than S. pyogenes. The recurrent nature of cellulitis became evident. This study adds to our understanding of S. pyogenes infections in Finland and provides a basis for comparison with other countries and future trends. emm type surveillance and outcome analyses remain important for detecting changes in type distribution that might lead to increases in incidence and case fatality. Bacterial characterisation serves as a basis for disease pathogenesis studies and vaccine development.

Relevance: 100.00%

Abstract:

Aims: Helicobacter pylori infection, although its prevalence is declining in the Western world, is still responsible for several clinically important diseases. None of the diagnostic tests is perfect, and in this study the performance of three stool antigen tests was assessed. In areas of high H. pylori prevalence, defining which patients would benefit most from eradication therapy may be a problem; the role of duodenal gastric metaplasia in categorizing patients at risk for duodenal ulcer was evaluated in this respect. Whether persistent chronic inflammation and elevated H. pylori antibodies after successful eradication are associated with each other or with atrophic gastritis, a long-term sequela of H. pylori infection, was also studied. Patients and methods: The three stool antigen tests were assessed in pre- and post-eradication settings among 364 subjects in two studies, compared with the rapid urease test (RUT), histology, culture, the 13C-urea breath test (UBT) and enzyme immunoassay (EIA) based H. pylori serology. The association between duodenal gastric metaplasia and duodenal ulcer was evaluated in a retrospective study including 1054 patients gastroscopied for clinical indications and 154 patients previously operated on for duodenal ulcer. The extent of duodenal gastric metaplasia was assessed from histological specimens in different patient groups formed on the basis of gastroscopy findings and H. pylori infection. Chronic gastric inflammation (108 patients) and H. pylori antibodies and serum markers for atrophy (77 patients) were assessed in patients earlier treated for H. pylori. Results: Of the stool antigen tests studied, the monoclonal antibody-based EIA test showed the highest sensitivity and specificity both in the pre-treatment setting (96.9% and 95.9%) and after therapy (96.9% and 97.8%). The polyclonal stool antigen test and the in-office test had, at baseline, a sensitivity of 91% and 94% and a specificity of 96% and 89%, respectively, and in the post-treatment setting, a sensitivity of 78% and 91% and a specificity of 97%, respectively. Duodenal gastric metaplasia was strongly associated with H. pylori positive duodenal ulcer (odds ratio 42). Although still common five years after eradication, persistent chronic gastric inflammation (21%) and elevated H. pylori antibodies (33%) were neither associated with each other nor with atrophic gastritis. Conclusions: Current H. pylori infection can feasibly be diagnosed by a monoclonal antibody-based EIA test with accuracy comparable to that of the reference methods. The performance of the polyclonal test was inferior to that of the monoclonal test, especially in the post-treatment setting. The in-office test had a low specificity for primary diagnosis, and positive test results should therefore probably be confirmed with another test before eradication therapy is prescribed. The presence of widespread duodenal gastric metaplasia showed promising results in detecting patients who should be treated for H. pylori due to an increased risk of duodenal ulcer. If serology is used later in patients earlier successfully treated for H. pylori, it should be taken into account that H. pylori antibodies may remain elevated for years for unknown reasons. However, this phenomenon was not found to be associated with persistent chronic inflammation or atrophic changes.
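For reference, the diagnostic accuracy figures quoted above are simple functions of a 2x2 table of test results against the reference standard. A short sketch with invented counts (chosen so the results come out near the monoclonal test's reported pre-treatment 96.9%/95.9%, but not the study's actual data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and odds ratio from a 2x2 table of counts."""
    sensitivity = tp / (tp + fn)        # true positives among those with infection
    specificity = tn / (tn + fp)        # true negatives among those without infection
    odds_ratio = (tp * tn) / (fp * fn)  # cross-product ratio of the table
    return sensitivity, specificity, odds_ratio

# Invented example counts for a stool antigen test vs. the reference methods.
sens, spec, or_ = diagnostic_metrics(tp=94, fp=4, fn=3, tn=93)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}, odds ratio={or_:.1f}")
```

The same cross-product ratio underlies the odds ratio of 42 quoted for the association between duodenal gastric metaplasia and duodenal ulcer, computed there from exposure rather than test-result counts.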