937 results for National Institute of Standards and Technology (U.S.). Technology Services.


Abstract:

Description based on: 1981.

Abstract:

Imprint varies: Washington, D.C., 1984-19 ; Gaithersburg, MD <1996->

Abstract:

"July 1996."

Abstract:

Automatic spoken Language Identification (LID) is the process of identifying the language spoken within an utterance. The challenge of this task is that no prior information is available about the content of the utterance or the identity of the speaker. The trend of globalization and the pervasive popularity of the Internet will amplify the need for the capabilities spoken language identification systems provide. A prominent application arises in call centers dealing with speakers of different languages. Another important application is indexing and searching huge speech data archives and corpora that contain multiple languages. The aim of this research is to develop techniques targeted at producing a faster and more accurate automatic spoken LID system, relative to systems from the previous National Institute of Standards and Technology (NIST) Language Recognition Evaluation. Acoustic and phonetic speech information are targeted as the most suitable features for representing the characteristics of a language. To model the acoustic speech features, a Gaussian Mixture Model (GMM) based approach is employed. Phonetic speech information is extracted using existing speech recognition technology. Various techniques to improve LID accuracy are also studied. One approach examined is the use of Vocal Tract Length Normalization to reduce the speech variation caused by different speakers. A linear data fusion technique is adopted to combine the various aspects of information extracted from speech. As a result of this research, a LID system was implemented and presented for evaluation in the 2003 Language Recognition Evaluation conducted by NIST.
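The GMM-based acoustic approach described above can be sketched as follows: one diagonal-covariance Gaussian mixture is trained per language on acoustic feature vectors (e.g. cepstral features), and an utterance is assigned to the language whose model gives the highest total log-likelihood. The two-language setup and all parameters below are illustrative assumptions, not the models from the thesis:

```python
import numpy as np

def gmm_log_likelihood(frames, weights, means, variances):
    """Total log-likelihood of an utterance (frames x dims) under a
    diagonal-covariance Gaussian mixture."""
    d = frames.shape[1]
    component_ll = []
    for w, mu, var in zip(weights, means, variances):
        diff = frames - mu
        ll = -0.5 * ((diff ** 2 / var).sum(axis=1)
                     + np.log(var).sum() + d * np.log(2 * np.pi))
        component_ll.append(np.log(w) + ll)
    # log-sum-exp over mixture components, then sum over frames
    return np.logaddexp.reduce(np.stack(component_ll), axis=0).sum()

def identify_language(frames, models):
    """Return the language whose GMM scores the utterance highest."""
    scores = {lang: gmm_log_likelihood(frames, *p) for lang, p in models.items()}
    return max(scores, key=scores.get)

# Toy two-language setup (hypothetical parameters, two mixture components each).
models = {
    "english": ([0.5, 0.5], [np.zeros(2), np.ones(2)], [np.ones(2)] * 2),
    "mandarin": ([0.5, 0.5], [np.full(2, 4.0), np.full(2, 5.0)], [np.ones(2)] * 2),
}
rng = np.random.default_rng(0)
utterance = rng.normal(loc=4.5, scale=0.5, size=(50, 2))  # frames near "mandarin"
print(identify_language(utterance, models))  # -> mandarin
```

In a real system the mixtures would be trained with EM on labelled speech, and the per-language scores could then feed the linear fusion stage mentioned above.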

Abstract:

Many substation applications require accurate time-stamping. The performance of systems such as the Network Time Protocol (NTP), IRIG-B and one pulse per second (1-PPS) has been sufficient to date. However, new applications, including IEC 61850-9-2 process bus and phasor measurement, require accuracy of one microsecond or better. Furthermore, process bus applications are taking time synchronisation out into high voltage switchyards, where cable lengths may have an impact on timing accuracy. IEEE Std 1588, the Precision Time Protocol (PTP), is the means of achieving this higher level of performance preferred by the smart grid standardisation roadmaps (from both the IEC and the US National Institute of Standards and Technology), and it integrates well into Ethernet-based substation automation systems. Significant benefits of PTP include automatic path length compensation, support for redundant time sources and the cabling efficiency of a shared network. This paper benchmarks the performance of the established IRIG-B and 1-PPS synchronisation methods over a range of path lengths representative of a transmission substation. The performance of PTP using the same distribution system is then evaluated and compared to the existing methods to determine whether the performance justifies the additional complexity. Experimental results show that a PTP timing system maintains the synchronising performance of 1-PPS and IRIG-B timing systems when using the same fibre optic cables, and further meets the needs of process buses in large substations.
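PTP's automatic path length compensation rests on a four-timestamp exchange: the master sends Sync at t1 (received at t2), and the slave sends Delay_Req at t3 (received at t4). Assuming a symmetric path, the clock offset and mean path delay follow directly; the sketch below shows the standard IEEE 1588 delay request-response arithmetic, with made-up nanosecond timestamps:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Slave clock offset from master and one-way path delay, per the
    IEEE 1588 delay request-response mechanism.
    Assumes a symmetric network path (equal delay each way)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, mean_path_delay

# Illustrative timestamps in nanoseconds: true offset 500 ns, true delay 200 ns.
t1 = 1_000_000
t2 = t1 + 200 + 500   # propagation delay + slave clock offset
t3 = 1_050_000
t4 = t3 + 200 - 500   # return trip: delay minus offset
print(ptp_offset_and_delay(t1, t2, t3, t4))  # -> (500.0, 200.0)
```

Asymmetric cable runs break the symmetry assumption, which is why the path lengths benchmarked in the paper matter for the achievable accuracy.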

Abstract:

The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, however, there is broad agreement on the growing demand for cloud computing. Some estimates suggest that spending on cloud-related technologies and services in the next few years may climb as high as USD 42 billion/year (Buyya et al. 2009).

Abstract:

For TREC Crowdsourcing 2011 (Stage 2) we propose a network-based approach for assigning an indicative measure of worker trustworthiness in crowdsourced labelling tasks. Workers, the gold standard, and worker/gold-standard agreements are modelled as a network. For the purpose of worker trustworthiness assignment, a variant of the PageRank algorithm, named TurkRank, is used to adaptively combine evidence that suggests worker trustworthiness, i.e., agreement with other trustworthy co-workers and agreement with the gold standard. A single parameter controls the importance of co-worker agreement versus gold-standard agreement. The TurkRank score calculated for each worker is then incorporated into a worker-weighted mean label aggregation.
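The abstract's combination of co-worker agreement and gold-standard agreement can be sketched as a PageRank-style power iteration: trust propagates over a worker-agreement matrix, blended with gold-standard agreement through a single mixing parameter (the teleport in PageRank terms). The formulation below is a hypothetical illustration of this idea, not the authors' exact TurkRank definition:

```python
import numpy as np

def turkrank(agreement, gold_agreement, lam=0.5, iters=100):
    """Power-iteration sketch of a TurkRank-style trust score.

    agreement[i, j]: fraction of shared items on which workers i and j agree.
    gold_agreement[i]: worker i's agreement rate with the gold standard.
    lam: weight on co-worker agreement versus gold-standard agreement.
    (Illustrative formulation; see the TREC paper for the exact definition.)
    """
    # Column-normalise so each worker distributes trust, as PageRank does links.
    col_sums = agreement.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    M = agreement / col_sums
    g = gold_agreement / gold_agreement.sum()
    r = np.full(len(g), 1.0 / len(g))
    for _ in range(iters):
        r = lam * (M @ r) + (1 - lam) * g
    return r / r.sum()

# Three workers: w0 and w1 agree with each other and the gold; w2 does not.
A = np.array([[0.0, 0.9, 0.1],
              [0.9, 0.0, 0.1],
              [0.1, 0.1, 0.0]])
gold = np.array([0.9, 0.8, 0.2])
scores = turkrank(A, gold)
print(scores)  # w2 receives the lowest trust score
```

The resulting scores could then weight each worker's labels in the mean aggregation step.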

Abstract:

The US National Institute of Standards and Technology (NIST) showed that, in 2004, owners and operations managers bore two thirds of the total industry cost burden from inadequate interoperability in construction projects from inception to operation, amounting to USD 10.6 billion. Building Information Modelling (BIM) and similar tools were identified by Engineers Australia in 2005 as potential instruments to significantly reduce this sum, which in Australia could amount to a total industry-wide cost burden of AUD 12 billion. Public sector road authorities in Australia have a key responsibility in driving initiatives to reduce greenhouse gas emissions from the construction and operation of transport infrastructure. However, as previous research has shown, the Environmental Impact Assessment process, typically used for project approvals and permitting based on the project designs available at the consent stage, lacks Key Performance Indicators (KPIs) that include long-term impact factors and the transfer of information throughout the project life cycle. In the building construction industry, BIM is widely used to model sustainability KPIs such as energy consumption, and is integrated with facility management systems. This paper proposes that a similar use of BIM in the early design phases of transport infrastructure could provide: (i) productivity gains through improved interoperability and documentation; (ii) the opportunity to carry out detailed cost-benefit analyses leading to significant operational cost savings; (iii) coordinated planning of street and highway lighting with other energy and environmental considerations; (iv) measurable KPIs that include long-term impact factors and are transferable throughout the project life cycle; and (v) the opportunity to integrate design documentation with sustainability whole-of-life targets.

Abstract:

At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes, named the RMX hash function mode, was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on RMX-hash-then-sign schemes. We then show that these limitations can be overcome for RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
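The core idea of RMX is to prepend a fresh random salt r and XOR it into each message block before hashing, so the signer hashes a randomized message rather than M itself. The sketch below illustrates this message-preprocessing idea over SHA-256; the block size, zero padding, and salt handling here are simplifications for illustration, not the exact SP 800-106 construction:

```python
import hashlib
import os

BLOCK = 64  # SHA-256 message-block size in bytes (simplification)

def rmx_digest(message: bytes, salt: bytes) -> bytes:
    """Illustrative RMX-style randomized hash: H(r || (M1^r) || (M2^r) || ...).
    Not the exact SP 800-106 construction; real padding rules are omitted."""
    assert len(salt) == BLOCK
    # Zero-pad the message to a whole number of blocks (simplified padding).
    padded = message + b"\x00" * (-len(message) % BLOCK)
    h = hashlib.sha256()
    h.update(salt)  # the salt itself is hashed first...
    for i in range(0, len(padded), BLOCK):
        # ...then each message block is XORed with the salt before hashing.
        block = padded[i:i + BLOCK]
        h.update(bytes(b ^ s for b, s in zip(block, salt)))
    return h.digest()

salt = os.urandom(BLOCK)          # fresh randomness per signature
d1 = rmx_digest(b"message to sign", salt)
assert d1 == rmx_digest(b"message to sign", salt)   # deterministic per salt
assert d1 != rmx_digest(b"message to sign", os.urandom(BLOCK))
```

Because the salt changes per signature, an attacker cannot precompute colliding message pairs offline, which is what shifts the security requirement from collision resistance toward second-preimage-style properties.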

Abstract:

There exists a minimum in the Waring function, psi(T) = -d(ln p)/d(1/T), and in the Riedel function, alpha(T) = d(ln p)/d(ln T), in the liquid-vapor coexistence curve for most fluids. By analyzing National Institute of Standards and Technology data for the molar enthalpy of vaporization and the compressibility variation at the liquid-vapor phase change of 105 fluids, we find that the temperatures of these minima are linearly correlated with the critical temperature, Tc. Using reduced coordinates, we also demonstrate that the minima are well correlated with the acentric factor. These correlations are used for testing four well-known vapor pressure equations in the Pitzer corresponding-states scheme.
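The two functions are related by alpha(T) = psi(T)/T, since d(1/T) = -dT/T^2 and d(ln T) = dT/T. The sketch below evaluates both by finite differences on an Antoine-type vapor pressure curve and checks that identity numerically; the Antoine constants are illustrative values for benzene, not the NIST data analyzed in the paper:

```python
import math

def ln_p(T, A=4.01814, B=1203.835, C=-53.226):
    """Antoine vapor-pressure curve (illustrative benzene constants,
    T in K, p in bar): log10 p = A - B / (T + C)."""
    return math.log(10) * (A - B / (T + C))

def waring_psi(T, h=1e-7):
    """psi(T) = -d(ln p)/d(1/T), central finite difference in u = 1/T."""
    u = 1.0 / T
    return -(ln_p(1 / (u + h)) - ln_p(1 / (u - h))) / (2 * h)

def riedel_alpha(T, h=1e-4):
    """alpha(T) = d(ln p)/d(ln T), central finite difference in ln T."""
    lt = math.log(T)
    return (ln_p(math.exp(lt + h)) - ln_p(math.exp(lt - h))) / (2 * h)

T = 320.0
print(riedel_alpha(T), waring_psi(T) / T)  # the two agree: alpha = psi / T
```

A simple Antoine curve has no interior minimum in psi or alpha; locating the minima discussed in the paper requires vapor pressure data accurate over the whole coexistence curve.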

Abstract:

This article concerns an investigation of the full-scale evacuation of a building with a configuration similar to that of the World Trade Center (WTC) North Tower, using computer simulation. A range of evacuation scenarios is explored in order to better understand the evacuation of the WTC on 11 September 2001. The analysis makes use of response time data derived from a study of published WTC survivor accounts. Geometric details of the building are obtained from architects' plans, while the total building population used in the scenarios is based on estimates produced by the National Institute of Standards and Technology's formal investigation into the evacuation. This paper attempts to approximate the events of 11 September 2001 and pursues several 'what if' questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom. More generally, the paper explores issues associated with the practical limits of building size that can be expected to be efficiently evacuated using stairs alone.