Abstract:
A series of ultra-lightweight digital true random number generators (TRNGs) is presented. These TRNGs are based on the observation that, when a circuit switches from a metastable state to a bi-stable state, the resulting state may be random. Four such circuits with low hardware cost are presented: one uses an XOR gate; one uses a lookup table; one uses a multiplexer and an inverter; and one uses four transistors. The three TRNGs based on the first three circuits are implemented on a field-programmable gate array and successfully pass the DIEHARD RNG tests and the National Institute of Standards and Technology (NIST) RNG tests. To the best of the authors' knowledge, the proposed TRNG designs are the most lightweight among existing TRNGs.
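As an illustration of the kind of check the NIST suite performs, its simplest test, the frequency (monobit) test from NIST SP 800-22, can be sketched in a few lines (the 0.01 significance threshold is the suite's default):

```python
import math

def monobit_frequency_test(bits):
    """NIST SP 800-22 frequency (monobit) test: checks whether the
    proportion of ones in a bit sequence is consistent with randomness.
    Returns the P-value; the sequence passes if P-value >= 0.01."""
    n = len(bits)
    # Map bits {0, 1} to {-1, +1} and sum.
    s = sum(2 * b - 1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# A trivially biased sequence fails; a balanced alternating one passes.
assert monobit_frequency_test([1] * 100) < 0.01
assert monobit_frequency_test([i % 2 for i in range(100)]) >= 0.01
```

A real assessment of a TRNG runs the full battery of tests over many long sequences; the monobit test alone only rejects gross bias.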
Abstract:
The Ohr (organic hydroperoxide resistance) family of 15-kDa Cys-based, thiol-dependent peroxidases is central to the bacterial response to stress induced by organic hydroperoxides but not by hydrogen peroxide. Ohr has a unique three-dimensional structure and requires dithiols, but not monothiols, to support its activity. However, the physiological reducing system of Ohr has not yet been identified. Here we show that lipoylated enzymes present in the bacterial extracts of Xylella fastidiosa interacted physically and functionally with this Cys-based peroxidase, whereas thioredoxin and glutathione systems failed to support Ohr peroxidase activity. Furthermore, we could reconstitute in vitro three lipoyl-dependent systems as the Ohr physiological reducing systems. We also showed that OsmC from Escherichia coli, an orthologue of Ohr from Xylella fastidiosa, is specifically reduced by lipoyl-dependent systems. These results represent the first description of a Cys-based peroxidase that is directly reduced by lipoylated enzymes.
Abstract:
BACKGROUND Oxidized lipoproteins and anti-oxidized low-density lipoprotein (anti-oxLDL) antibodies (Abs) have been detected in plasma in response to blood pressure (BP) elevation, suggesting the participation of the adaptive immune system. Therefore, treatment of hypertension may act on the immune response by decreasing oxidation stimuli. However, this issue has not been addressed. Thus, we have here analyzed anti-oxLDL Abs in untreated (naive) hypertensive patients shortly after initiation of antihypertensive therapeutic regimens. METHODS Titers of anti-oxLDL Abs were measured in subjects with recently diagnosed stage 1 hypertension (n = 94), in primary prevention of coronary disease, with no other risk factors, and naive of antihypertensive medication at entry. Subjects were randomly assigned to receive perindopril, hydrochlorothiazide (HCTZ), or indapamide (INDA) for 12 weeks, with additional perindopril if necessary to achieve BP control. Abs against copper-oxidized LDL were measured by enzyme-linked immunosorbent assay. RESULTS Twelve-week antihypertensive treatment reduced both office-based and 24-h ambulatory BP measurements (P < 0.0005). The decrease in BP was accompanied by a reduction in thiobarbituric acid-reactive substances (TBARS) (P < 0.05), an increase in anti-oxLDL Ab titers (P < 0.005), and improvement in flow-mediated dilation (FMD) (P < 0.0005), independently of treatment. Thus, along with the reduction in BP, we observed favorable changes in anti-oxLDL titers and FMD. CONCLUSIONS We observed that anti-oxLDL Ab titers increase after antihypertensive therapy in primary prevention when BP targets are achieved. Our results are in agreement with the concept that the propensity to oxidation is increased by essential hypertension, and anti-oxLDL Abs may be protective and serve as potential biomarkers for the follow-up of hypertension treatment.
Abstract:
Entanglement is an essential quantum resource for the acceleration of information processing as well as for sophisticated quantum communication protocols. Quantum information networks are expected to convey information from one place to another by using entangled light beams. We demonstrated the generation of entanglement among three bright beams of light, all of different wavelengths (532.251, 1062.102, and 1066.915 nanometers). We also observed disentanglement for finite channel losses, the continuous variable counterpart to entanglement sudden death.
Abstract:
Background: Li-Fraumeni (LFS) and Li-Fraumeni-like (LFL) syndromes are associated with germline TP53 mutations and are characterized by the development of central nervous system tumors, sarcomas, adrenocortical carcinomas, and other early-onset tumors. Due to the high frequency of breast cancer in LFS/LFL families, these syndromes clinically overlap with hereditary breast cancer (HBC). Germline point mutations in the BRCA1, BRCA2, and TP53 genes are associated with a high risk of breast cancer. Large rearrangements involving these genes are also implicated in the HBC phenotype. Methods: We screened for DNA copy number changes by MLPA in the BRCA1, BRCA2, and TP53 genes in 23 breast cancer patients with a clinical diagnosis consistent with LFS/LFL; most of these families also met the clinical criteria for other HBC syndromes. Results: We found no DNA copy number alterations in the BRCA2 and TP53 genes, but in one patient we detected a 36.4-kb BRCA1 microdeletion, confirmed and further mapped by array-CGH, encompassing exons 9-19. Breakpoint sequencing analysis suggests that this rearrangement was mediated by flanking Alu sequences. Conclusion: This is the first description of a germline intragenic BRCA1 deletion in a breast cancer patient with a family history consistent with both LFL and HBC syndromes. Our results show that large rearrangements in these known cancer predisposition genes occur, but are not a frequent cause of cancer susceptibility.
Abstract:
We obtain the Paris law of fatigue crack propagation in a fuse network model where the accumulated damage in each resistor increases with time as a power law of the local current amplitude. When a resistor reaches its fatigue threshold, it burns irreversibly. Over time, this drives cracks to grow until the system is fractured into two parts. We study the relation between the macroscopic exponent of the crack-growth rate, entering the phenomenological Paris law, and the microscopic damage-accumulation exponent, γ, under the influence of disorder. How the jumps of the growing crack, Δa, and the waiting time between successive breaks, Δt, depend on the type of material, via γ, is also investigated. We find that the averages of these quantities scale as power laws of the crack length a, ⟨Δa⟩ ∝ a^α and ⟨Δt⟩/⟨t_r⟩ ∝ a^(−β), where ⟨t_r⟩ is the average rupture time. Strikingly, our results show, for small values of γ, a decrease in the exponent of the Paris law in comparison with the homogeneous case, leading to an increase in the lifetime of breaking materials. For the particular case γ = 0, when fatigue is ruled exclusively by disorder, an analytical treatment confirms the results obtained by simulation. Copyright (C) EPLA, 2012
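Scaling relations of the form ⟨Δa⟩ ∝ a^α are typically extracted from simulation data by fitting a slope in log-log coordinates; a minimal sketch with synthetic data (the exponent 0.8 below is illustrative, not a value from the paper):

```python
import math
import random

# Synthetic data obeying <Delta a> ∝ a^alpha with an illustrative alpha = 0.8
# and small multiplicative noise (standing in for simulation scatter).
random.seed(0)
true_alpha = 0.8
a_values = [10 ** (3 * i / 49) for i in range(50)]          # crack lengths over 3 decades
jumps = [a ** true_alpha * math.exp(random.gauss(0, 0.05)) for a in a_values]

# Least-squares slope in log-log coordinates estimates the exponent alpha.
xs = [math.log(a) for a in a_values]
ys = [math.log(j) for j in jumps]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
alpha_hat = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
```

The same fit applied to ⟨Δt⟩/⟨t_r⟩ versus a would recover −β.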
Abstract:
Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness and farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry-matter curve over time as well as the grain yield under full and moderate water-deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: the most likely grain productivity was found to be 10,604 kg ha⁻¹, very similar to the productivity simulated by the deterministic model and to the real conditions based on a field experiment.
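The sampling scheme described above, a triangular distribution for the harvest index combined with a bivariate normal for season-average radiation and temperature, can be sketched as a Monte Carlo loop. All numeric parameters and the toy biomass model below are illustrative placeholders, not values from the paper:

```python
import random

random.seed(42)
n_runs = 10_000

RUE = 1.6            # radiation-use efficiency, g dry matter per MJ (illustrative)
SEASON_DAYS = 120    # illustrative season length

yields = []
for _ in range(n_runs):
    # Harvest index drawn from a triangular distribution (low, high, mode illustrative).
    hi = random.triangular(0.40, 0.55, 0.48)

    # Correlated season-average radiation (MJ m-2 d-1) and temperature (C):
    # a hand-rolled bivariate normal built from a shared standard-normal factor.
    z_shared, z_indep = random.gauss(0, 1), random.gauss(0, 1)
    radiation = 18.0 + 2.0 * z_shared
    temperature = 24.0 + 0.6 * z_shared + 1.4 * z_indep

    # Toy above-ground biomass: radiation-use efficiency with a crude
    # temperature response (purely illustrative).
    temp_factor = max(0.0, 1 - abs(temperature - 25.0) / 15.0)
    biomass = RUE * radiation * temp_factor * SEASON_DAYS * 10  # kg ha-1, rough
    yields.append(hi * biomass)

yields.sort()
median_yield = yields[n_runs // 2]   # a point summary of the yield distribution
```

The output of interest is the whole distribution of simulated yields, from which a most-likely productivity and its spread can be read off.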
Abstract:
Background: The activation of innate immune responses by Plasmodium vivax results in the activation of effector cells and an excessive production of pro-inflammatory cytokines that may culminate in deleterious effects. Here, we examined the activation and function of neutrophils during acute episodes of malaria. Materials and Methods: Blood samples were collected from P. vivax-infected patients at admission (day 0) and 30-45 days after treatment with chloroquine and primaquine. Expression of activation markers and levels of cytokines produced by highly purified monocytes and neutrophils were measured by Cytometric Bead Array. Phagocytic activity, superoxide production, chemotaxis, and the presence of G protein-coupled receptor kinase 2 (GRK2) were also evaluated in neutrophils from malaria patients. Principal Findings: Both monocytes and neutrophils from P. vivax-infected patients were highly activated. While monocytes were found to be the main source of cytokines in response to TLR ligands, neutrophils showed enhanced phagocytic activity and superoxide production. Interestingly, neutrophils from the malaria patients expressed high levels of GRK2 and low levels of CXCR2, and displayed impaired chemotaxis towards IL-8 (CXCL8). Conclusion: Activated neutrophils from malaria patients are a poor source of pro-inflammatory cytokines and display reduced chemotactic activity, suggesting a possible mechanism for an enhanced susceptibility to secondary bacterial infection during malaria.
Abstract:
Dengue is the most prevalent arboviral infection, affecting millions of people every year. Attempts to control such infection are being made, and the development of a vaccine is a World Health Organization priority. Among the proteins being tested as vaccine candidates in preclinical settings is the non-structural protein 1 (NS1). In the present study, we tested the immune responses generated by targeting the NS1 protein to two different dendritic cell populations. Dendritic cells (DCs) are important antigen-presenting cells, and targeting proteins to maturing DCs has proved to be an efficient means of immunization. Antigen targeting is accomplished by the use of a monoclonal antibody (mAb) directed against a DC cell-surface receptor fused to the protein of interest. We used two mAbs (αDEC205 and αDCIR2) to target two distinct DC populations in mice, expressing either the DEC205 or the DCIR2 endocytic receptor, respectively. The fusion mAbs were successfully produced, bound to their respective receptors, and were used to immunize BALB/c mice in the presence of polyriboinosinic:polyribocytidylic acid (poly(I:C)) as a DC maturation stimulus. We observed the induction of strong anti-NS1 antibody responses and similar antigen-binding affinity irrespective of the DC population targeted. Nevertheless, the IgG1/IgG2a ratios differed between the mouse groups immunized with the αDEC-NS1 and αDCIR2-NS1 mAbs. When we tested the induction of cellular immune responses, the number of IFN-γ-producing cells was higher in αDEC-NS1-immunized animals. In addition, mice immunized with the αDEC-NS1 mAb were significantly protected from a lethal intracranial challenge with the DENV2 NGC strain when compared to mice immunized with the αDCIR2-NS1 mAb. Protection was partially mediated by CD4(+) and CD8(+) T cells, as depletion of these populations reduced both survival and morbidity signs.
We conclude that targeting the NS1 protein to the DEC205(+) DC population with poly (I:C) opens perspectives for dengue vaccine development.
Abstract:
This thesis introduces the key ideas of modern cryptography, presents the perfectly secret one-time pad cipher, and discusses the practical drawbacks of using that scheme. The idea of mathematically secure encryption schemes must give way to computationally secure ones. Here the concept of pseudorandomness becomes crucial: just as computational security is a weakening of perfect secrecy, pseudorandomness is a weakening of true randomness. Methods are therefore needed to assess the quality of a pseudorandom number generator. The National Institute of Standards and Technology provides criteria for characterizing and selecting appropriate generators on the basis of statistical tests. Some of these tests have been implemented in the CrypTool learning portal developed by several universities and research centers in Germany and Austria.
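The one-time pad discussed above reduces to a byte-wise XOR of message and key; a minimal sketch (perfect secrecy holds only if the key is truly random, as long as the message, and never reused):

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte of the data with a key byte.
    Because XOR is its own inverse, the same function also decrypts."""
    if len(key) != len(data):
        raise ValueError("key must be exactly as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh random key per message
ciphertext = otp_encrypt(message, key)
assert otp_encrypt(ciphertext, key) == message   # XOR twice recovers the plaintext
```

The scheme's criticisms follow directly from the code: the key is as long as the message, must be exchanged securely, and reusing it leaks the XOR of two plaintexts.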
Abstract:
The digital signature is one of the most important developments of public-key cryptography, as it allows its security services to be implemented. Public-key cryptography, introduced in 1976 by Diffie and Hellman, was the single great revolution in the history of cryptography. It breaks radically with everything that preceded it, both because its algorithms rest on mathematical functions rather than on substitution and permutation operations, and above all because it is asymmetric: it uses two distinct keys (whereas symmetric cryptography uses a single key shared between the parties). In particular, the mathematical functions on which this cryptography is based are well known in number theory, for example integer factorization and the discrete logarithm; their importance stems from the belief that they are computationally intractable. Among the various digital signature schemes based on public-key cryptography, this work studies the one proposed by NIST (the National Institute of Standards and Technology): the Digital Signature Standard (DSS), often referred to as DSA (Digital Signature Algorithm) after the algorithm it uses. The work is organized in three chapters. Chapter 1 introduces the discrete logarithm (central to the DSA algorithm) and presents some algorithms for computing it. Chapter 2, after an overview of public-key cryptography, defines the digital signature and its properties, and closes with an explanation of an important tool used in digital signature algorithms: hash functions. Chapter 3, finally, analyzes DSA in detail through its three phases (setup, generation, and verification), showing how its operation and security follow from the concepts presented earlier.
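The three DSA phases (setup, signature generation, verification) can be sketched over deliberately tiny toy parameters; the primes below (q = 1009, p = 10091 = 10·q + 1) are for illustration only, whereas the standard mandates far larger parameters:

```python
import hashlib
import secrets

# --- Setup: toy DSA domain parameters ---
q = 1009                       # small prime (toy; DSS uses 160+ bits)
p = 10091                      # prime with q dividing p - 1 (toy; DSS uses 1024+ bits)
g = pow(2, (p - 1) // q, p)    # generator of the order-q subgroup of Z_p*

def H(message: bytes) -> int:
    # Hash the message and reduce modulo q.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1       # private key
    return x, pow(g, x, p)                 # (private, public)

# --- Generation: r = (g^k mod p) mod q, s = k^{-1}(H(m) + x r) mod q ---
def sign(message: bytes, x: int):
    while True:
        k = secrets.randbelow(q - 1) + 1   # fresh secret per message
        r = pow(g, k, p) % q
        s = pow(k, -1, q) * (H(message) + x * r) % q
        if r != 0 and s != 0:
            return r, s

# --- Verification: v = (g^{u1} y^{u2} mod p) mod q must equal r ---
def verify(message: bytes, sig, y: int) -> bool:
    r, s = sig
    if not (0 < r < q and 0 < s < q):
        return False
    w = pow(s, -1, q)
    u1 = H(message) * w % q
    u2 = r * w % q
    return pow(g, u1, p) * pow(y, u2, p) % p % q == r

x, y = keygen()
sig = sign(b"hello", x)
assert verify(b"hello", sig, y)
# A different message, or a tampered (r, s), fails verification
# (with overwhelming probability for realistically sized parameters).
```

Note how security hinges on the discrete logarithm of Chapter 1: recovering x from y = g^x mod p is assumed intractable at real parameter sizes, and each signature must use a fresh, secret k.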
Abstract:
Mode of access: Internet.
Abstract:
Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high-technology industries in Australia; provide governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013 [1]. Strong open data policy equates to $16 billion in new value [2]. Australian Government initiatives such as the Digital Earth inspired "National Map" offer a platform and pathway to embrace the concept of a "BIM Globe", while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences, most notably the UK and NZ, and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a "BIM globe" metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry.
It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: the one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM). The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia's GDP [3], has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies.
The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies by, on the one hand, mandating the use of BIM on public procurement projects while at the same time providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14 according to UK Cabinet Office figures [4]. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while at the same time developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done.
We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.