179 results for 720501 Defence standards and calibrations
Abstract:
For TREC Crowdsourcing 2011 (Stage 2) we propose a network-based approach for assigning an indicative measure of worker trustworthiness in crowdsourced labelling tasks. Workers, the gold standard, and worker/gold-standard agreements are modelled as a network. For the purpose of worker trustworthiness assignment, a variant of the PageRank algorithm, named TurkRank, is used to adaptively combine evidence that suggests worker trustworthiness, i.e., agreement with other trustworthy co-workers and agreement with the gold standard. A single parameter controls the importance of co-worker agreement versus gold-standard agreement. The TurkRank score calculated for each worker is then incorporated into a worker-weighted mean label aggregation.
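The abstract specifies TurkRank only at a high level. The following minimal Python sketch shows one plausible PageRank-style formulation, assuming pairwise worker agreement rates and gold-standard agreement rates are available as inputs; the variable names, update rule, and the `beta` mixing parameter are illustrative reconstructions, not the authors' published algorithm.

```python
import numpy as np

def turkrank(W, g, beta=0.5, iters=100, tol=1e-9):
    """Illustrative TurkRank-style trust scoring.

    W    : (n, n) matrix of pairwise worker agreement rates.
    g    : length-n vector of agreement rates with the gold standard.
    beta : weight of co-worker agreement vs. gold-standard agreement.
    """
    W = np.asarray(W, dtype=float)
    g = np.asarray(g, dtype=float)
    n = len(g)
    # Column-normalise co-worker agreement so trust propagates PageRank-style.
    col = W.sum(axis=0)
    col[col == 0] = 1.0
    P = W / col
    gold = g / g.sum()               # normalised gold-agreement vector
    r = np.full(n, 1.0 / n)          # initial trust scores
    for _ in range(iters):
        r_new = beta * (P @ r) + (1 - beta) * gold
        r_new /= r_new.sum()
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

def weighted_label(labels, trust):
    """Trust-weighted mean aggregation of one item's labels."""
    labels = np.asarray(labels, dtype=float)
    return float(np.dot(labels, trust) / trust.sum())
```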
Abstract:
Contemporary lipidomics protocols are dependent on conventional tandem mass spectrometry for lipid identification. This approach is extremely powerful for determining lipid class and identifying the number of carbons and the degree of unsaturation of any acyl-chain substituents. Such analyses are, however, blind to isomeric variants arising from different carbon-carbon bonding motifs within these chains, including double bond position, chain branching, and cyclic structures. This limitation arises from the fact that conventional, low-energy collision-induced dissociation of even-electron lipid ions does not give rise to product ions from intrachain fragmentation of the fatty acyl moieties. To overcome this limitation, we have applied radical-directed dissociation (RDD) to the study of lipids for the first time. In this approach, bifunctional molecules that contain a photocaged radical initiator and a lipid-adducting group, such as 4-iodoaniline and 4-iodobenzoic acid, are used to form noncovalent complexes (i.e., adduct ions) with a lipid during electrospray ionization. Laser irradiation of these complexes at UV wavelengths (266 nm) cleaves the carbon-iodine bond to liberate a highly reactive phenyl radical. Subsequent activation of the nascent radical ions results in RDD with significant intrachain fragmentation of acyl moieties. This approach provides diagnostic fragments that are associated with the double bond position and the positions of chain branching in glycerophospholipids, sphingomyelins and triacylglycerols, and thus can be used to differentiate isomeric lipids differing only in such motifs. RDD is demonstrated for well-defined lipid standards and also reveals lipid structural diversity in olive oil and human very-low-density lipoprotein.
Abstract:
Two sources of uncertainty in the X-ray computed tomography imaging of polymer gel dosimeters are investigated in this paper. The first is a change in post-irradiation density, which is proportional to the computed tomography signal and is associated with a volume change. The second is reconstruction noise. A simple technique that increases the residual signal-to-noise ratio by almost two orders of magnitude is examined.
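The abstract does not spell out the noise-suppression technique. One common approach, shown below purely as an assumption about what such a technique could look like, is averaging N repeated reconstructions of the same dosimeter: stochastic reconstruction noise falls as 1/sqrt(N) while the dose-induced density signal is preserved, so the SNR improves by a factor of sqrt(N).

```python
import numpy as np

def average_scans(scans):
    """Average N repeated CT reconstructions of the same gel dosimeter.

    Uncorrelated reconstruction noise shrinks as 1/sqrt(N) under
    averaging, while the dose signal (the density change) is preserved,
    so SNR improves by sqrt(N). Averaging 100 scans, for example,
    yields a tenfold SNR gain.
    """
    stack = np.stack(scans, axis=0)   # shape (N, rows, cols)
    return stack.mean(axis=0)
```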
Abstract:
One of the most important parts of any Bridge Management System (BMS) is the condition assessment and rating of bridges. This paper introduces a procedure for condition assessment based on criticality and vulnerability analysis. According to this procedure, new rating equations are developed. The inventory data are used to determine the contribution of different critical factors such as environmental effects, flood, earthquake, wind, and vehicle impacts. The criticality of the components to live load and the vulnerability of the components to the above critical factors are identified. Based on the criticality and vulnerability of the components and the criticality of the factors, and by using the new rating equations, the condition assessment and rating of railway bridges and their components at the network level can be conducted. For the first time, this method incorporates structural analysis, available knowledge of risk assessment in structural engineering standards, and the experience of structural engineers in a practical way to enhance the reliability of the condition assessment and rating of a network of bridges.
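The paper's actual rating equations are not given in the abstract. The sketch below illustrates, under assumed 0-1 scales and hypothetical factor names, how component criticality, component vulnerabilities, and factor criticalities could combine into a single component-level rating input; it is a hedged reconstruction of the general idea, not the published equations.

```python
def component_rating(criticality, vulnerabilities, factor_weights):
    """Illustrative condition rating for one bridge component.

    criticality     : importance of the component under live load (0-1).
    vulnerabilities : {factor: vulnerability 0-1}, e.g. flood, wind.
    factor_weights  : {factor: criticality of that factor, 0-1}.

    Returns a 0-1 score where higher means higher risk (worse rating).
    """
    exposure = sum(factor_weights[f] * v for f, v in vulnerabilities.items())
    return criticality * exposure / max(sum(factor_weights.values()), 1e-9)

# Example: a pier highly critical to live load and very vulnerable to flood.
score = component_rating(
    criticality=0.9,
    vulnerabilities={"flood": 0.8, "earthquake": 0.3, "wind": 0.1},
    factor_weights={"flood": 0.5, "earthquake": 0.3, "wind": 0.2},
)
```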
Abstract:
The purpose of this paper is to take a critical look at the question "what is a competent project manager?" and bring some fresh, value-adding insights. This leads us to analyze the definitions and assessment approaches of project manager competence. Three major standards, as prescribed by PMI, IPMA, and GAPPS, are reviewed from attribute-based and performance-based approaches and from deontological and consequentialist ethics perspectives. Two fundamental tensions are identified: an ethical tension between the standards and the related competence assessment frameworks, and a tension between attribute-based and performance-based approaches. Aristotelian ethical and practical philosophy is brought in to reconcile these differences. Considering an ethics of character that rises beyond the normative deontological and consequentialist perspectives is suggested. Taking into consideration the mediating role of praxis and phrónêsis between theory and practice is advocated to resolve the tension between performance-based and attribute-based approaches to competence assessment.
Abstract:
Post-earthquake fire (PEF) is considered one of the highest-risk and most complicated problems affecting buildings in urban areas and can cause even more damage than the earthquake itself. However, most standards and codes ignore the implications of PEF, so buildings are not normally designed with PEF in mind. What is needed is for PEF factors to be routinely scrutinized and codified as part of the design process. A systematic application is presented as a means of mitigating the risk of PEF in urban buildings. This covers both existing buildings, in terms of retrofit solutions, and those yet to be designed, for which a PEF factor is proposed. To ensure the mitigation strategy meets the defined criteria, a minimum time, the safety-guaranteed time target, is defined within which the safety of the inhabitants of a building is guaranteed.
Abstract:
A combined experimental and numerical program was conducted to study the in-plane shear behaviour of hollow concrete masonry panels containing reinforced grout cores. This paper focuses on the numerical program. A two-dimensional macro-modelling strategy was used to simulate the behaviour of the confined masonry (CM) shear panels. Both the unreinforced masonry and the confining element were modelled using macro-masonry properties, and the steel reinforcement was modelled as an embedded truss element located within the grout using a perfectly bonded constraint. The FE model reproduced key behaviours observed in the experiments, including the shear strength, the deformation, and the crack patterns of the unconfined and confined masonry panels. The predictions of the validated model were used to evaluate the existing in-plane shear expressions available in national masonry standards and research publications.
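As a rough illustration of the kind of in-plane shear expression being evaluated, the sketch below uses the generic additive form V_n = V_masonry + V_steel found in several national masonry standards. The coefficients and the function signature are placeholders resembling that general form, not the specific expressions assessed in the paper.

```python
import math

def masonry_shear_capacity(A_n, f_m, A_v, s, f_y, d_v, k=0.083):
    """Generic masonry-plus-reinforcement in-plane shear capacity (N, mm, MPa).

    V_n = V_masonry + V_steel is the additive form used in several national
    masonry standards; the coefficient k and the 0.5 reinforcement efficiency
    factor below are placeholder values that vary between codes.
    """
    V_m = k * 4.0 * A_n * math.sqrt(f_m)   # masonry contribution
    V_s = 0.5 * (A_v / s) * f_y * d_v      # grouted-core reinforcement
    return V_m + V_s
```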
Abstract:
The US National Institute of Standards and Technology (NIST) showed that, in 2004, owners and operations managers bore two-thirds of the total industry cost burden from inadequate interoperability in construction projects from inception to operation, amounting to USD 10.6 billion. Building Information Modelling (BIM) and similar tools were identified by Engineers Australia in 2005 as potential instruments to significantly reduce this sum, which in Australia could amount to a total industry-wide cost burden of AUD 12 billion. Public sector road authorities in Australia have a key responsibility in driving initiatives to reduce greenhouse gas emissions from the construction and operation of transport infrastructure. However, as previous research has shown, the Environmental Impact Assessment process, typically used for project approvals and permitting based on project designs available at the consent stage, lacks Key Performance Indicators (KPIs) that include long-term impact factors and transfer of information throughout the project life cycle. In the building construction industry, BIM is widely used to model sustainability KPIs such as energy consumption, and is integrated with facility management systems. This paper proposes that a similar use of BIM in early design phases of transport infrastructure could provide: (i) productivity gains through improved interoperability and documentation; (ii) the opportunity to carry out detailed cost-benefit analyses leading to significant operational cost savings; (iii) coordinated planning of street and highway lighting with other energy and environmental considerations; (iv) measurable KPIs that include long-term impact factors and are transferable throughout the project life cycle; and (v) the opportunity for integrating design documentation with sustainability whole-of-life targets.
Abstract:
Cold-formed steel sections are commonly used in low-rise commercial and residential buildings. During fire events, cold-formed steel structural elements in these buildings can be exposed to elevated temperatures. Hence, after such events there is a need to evaluate their residual strengths. However, only limited information is available in relation to the residual strength of fire-exposed cold-formed steel sections. This research is aimed at investigating the distortional buckling capacities of fire-exposed cold-formed lipped channel sections. A series of compression tests of fire-exposed, short lipped channel columns made of varying steel grades and thicknesses was undertaken in this research. Test columns were first exposed to different elevated temperatures up to 800 °C, and then tested to failure after cooling down. Suitable finite element models were developed with post-fire mechanical properties to simulate the behaviour of the tested columns and were validated using test results. The residual compression capacities of short columns were also predicted using the current cold-formed steel standards and compared with test and finite element analysis results. This comparison showed that ambient-temperature design rules for columns can be used to predict the residual compression capacities of fire-exposed short or laterally restrained cold-formed steel columns, provided the maximum temperature experienced by the column can be estimated after a fire event. Such residual capacity assessments will allow engineers to evaluate the safety of fire-exposed buildings. This paper presents the details of this experimental study, the finite element analyses and the results.
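For context, here is a minimal sketch of the ambient-temperature Direct Strength Method distortional buckling check (AISI S100 form) that, per the abstract's conclusion, can be reused post-fire provided the squash load is computed from the residual (post-fire) yield stress. The function signature and usage are illustrative.

```python
def dsm_distortional_capacity(P_y, P_crd):
    """DSM distortional buckling strength of a column (AISI S100 form).

    P_y   : squash load = A_g * f_y; for post-fire assessment, use the
            residual (post-fire) yield stress, per the paper's finding.
    P_crd : elastic distortional buckling load (e.g. from a finite
            strip analysis).
    """
    lam_d = (P_y / P_crd) ** 0.5
    if lam_d <= 0.561:
        return P_y
    ratio = (P_crd / P_y) ** 0.6
    return (1.0 - 0.25 * ratio) * ratio * P_y
```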
Abstract:
In this study, the biodiesel properties and the effects of blends of microalgae oil methyl ester and petroleum diesel on a CI direct-injection diesel engine are investigated. Blends were obtained from the marine dinoflagellate Crypthecodinium cohnii and from waste cooking oil. The experiment was conducted using a four-cylinder, turbo-charged, common-rail direct-injection diesel engine at four loads (25%, 50%, 75% and 100%). Three blends (10%, 20% and 50%) of microalgae oil methyl ester and a 20% blend of waste cooking oil methyl ester were compared to petroleum diesel. To establish the suitability of the fuels for a CI engine, the effects of the three microalgae fuel blends at different engine loads were assessed by measuring engine performance, i.e. indicated mean effective pressure (IMEP), brake mean effective pressure (BMEP), in-cylinder pressure, maximum pressure rise rate, brake-specific fuel consumption (BSFC), brake thermal efficiency (BTE), heat release rate, and gaseous emissions (NO, NOx, and unburned hydrocarbons (UHC)). Results were then compared to engine performance characteristics for operation with a 20% waste cooking oil/petroleum diesel blend and with petroleum diesel. In addition, the physical and chemical properties of the fuels were measured. Use of microalgae methyl ester reduced the instantaneous cylinder pressure and engine output torque, compared to petroleum diesel, by a maximum of 4.5% for the 50% blend at full throttle. The lower calorific value of the microalgae oil methyl ester blends increased the BSFC, which ultimately reduced the BTE by up to 4% at higher loads. Minor reductions of IMEP and BMEP were recorded for both the microalgae and the waste cooking oil methyl ester blends at low loads, with a maximum reduction of 7% at 75% load compared to petroleum diesel. Furthermore, compared to petroleum diesel, gaseous emissions of NO and NOx increased for operation with biodiesel blends. At full load, NO and NOx emissions increased by 22% when 50% microalgae blends were used. Petroleum diesel and the 20% blend of waste cooking oil methyl ester had similar UHC emissions, but those of the microalgae oil methyl ester/petroleum diesel blends were reduced by at least 50% for all blends and engine conditions. The tested microalgae methyl esters contain some long-chain, polyunsaturated fatty acid methyl esters (FAMEs) (C22:5 and C22:6) not commonly found in terrestrial-crop-derived biodiesels, yet all fuel properties satisfied, or were very close to, the ASTM D6751-12 and EN 14214 standards. Therefore, Crypthecodinium cohnii-derived microalgae biodiesel/petroleum blends of up to 50% are projected to meet all fuel property standards, and the engine performance and emission results from this study clearly show their suitability for regular use in diesel engines.
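The BSFC and BTE trends reported above follow directly from their definitions: BSFC is fuel mass flow per unit brake power, and BTE is brake energy out over fuel energy in. The sketch below computes both from assumed, illustrative operating figures; the fuel flow, power, and LHV values are invented, not taken from the paper.

```python
def bsfc(fuel_flow_kg_h, brake_power_kw):
    """Brake-specific fuel consumption in g/kWh."""
    return 1000.0 * fuel_flow_kg_h / brake_power_kw

def brake_thermal_efficiency(bsfc_g_kwh, lhv_mj_kg):
    """BTE = brake energy out / fuel energy in; 1 kWh = 3.6 MJ.

    A lower LHV (as for the microalgae blends) raises BSFC at a given
    power and lowers BTE, matching the trend reported in the abstract.
    """
    return 3.6 / (bsfc_g_kwh / 1000.0 * lhv_mj_kg)

# Invented example figures:
b = bsfc(fuel_flow_kg_h=5.2, brake_power_kw=21.0)    # ~247.6 g/kWh
eta = brake_thermal_efficiency(b, lhv_mj_kg=41.0)    # ~0.355
```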
Abstract:
This paper offers a definition of elite media, arguing that their content focus will sufficiently meet the social responsibility needs of democracy. Its assumptions come from the Finkelstein and Leveson Inquiries and the regulatory British Royal Charter (2013). These provide guidelines on how media outlets meet 'social responsibility' standards, e.g. the press has a 'responsibility to be fair and accurate' (Finkelstein); an ethical press will feel a responsibility to 'hold power to account' (Leveson); news media 'will be held strictly accountable' (Royal Charter). The paper invokes the British principle of media opting in to observe standards and so serve the democracy. It gives examples from existing media and considers the social responsibility of media more generally. Obvious cases of 'quality' media include public broadcasters, e.g. the BBC and Al-Jazeera, and the 'quality' press, e.g. the NYT and Süddeutsche Zeitung, but also community broadcasters, specialised magazines, news agencies, distinctive web logs, and others. Where providing commentary, these outlets abjure gratuitous opinion, meeting a standard of being reasoned, informational and fair. Funding is almost a defining criterion, with many such services supported by the state, private trusts, public institutions, or volunteering by staff. Literature supporting the discussion of elite media includes work on their identity as primarily committed to a public good, e.g. the 'Public Value Test' of Moe and Donders (2011), with reference also to recent literature on developing public service media. Within its limits, the paper treats social media as participants among all media, including elite media, and as a parallel dimension of mass communication founded on interactivity. Elite media will fulfil the need for social responsibility, firstly, by providing one space, a 'plenary', for debate. Second is the notion of building public recognition of elite media as trustworthy. Third is the fact that elite media together form a large sector with the resources to sustain social cohesion and debate, notwithstanding pressure on funds and the impacts of a digital transformation that is undermining employment in media more than in most industries.
Abstract:
Traceability systems in the food supply chain are becoming increasingly necessary. RFID and the EPCglobal Network Standards are emerging technologies that bring new opportunities to develop high-performance traceability systems. This research proposes the analysis, design, and development of a traceability system based on RFID and the EPCglobal Network Standards that adheres to the requirements of global food traceability in terms of completeness of traceability information. Additional components, including a lot management system and an electronic transaction management system, augment the traditional system in order to supply the missing information. The proposed system was developed and applied in a rice supply chain. Results from experimentation showed that the additional components can significantly improve the completeness of traceability information. The collaboration between the EPCglobal Network Standards and the electronic transaction management system can improve the performance of RFID operations.
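As an illustration of the kind of event record an EPCglobal-Network-based system exchanges, below is a minimal EPCIS-style ObjectEvent for one rice lot. The field names follow the GS1 EPCIS model (eventTime, epcList, action, bizStep, readPoint), but every identifier and URN value, and the lot/transaction extension carrying the "missing" context, are invented examples, not the paper's schema.

```python
# Minimal EPCIS-style ObjectEvent as a plain dict (illustrative only).
shipping_event = {
    "type": "ObjectEvent",
    "eventTime": "2013-05-14T09:30:00+07:00",
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],   # example EPC
    "action": "OBSERVE",
    "bizStep": "urn:epcglobal:cbv:bizstep:shipping",
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00777.0"},
    # Hypothetical extensions supplying lot and transaction context,
    # in the spirit of the paper's additional components:
    "lotNumber": "RICE-LOT-0042",
    "bizTransactionList": [
        {"type": "po", "id": "urn:epcglobal:cbv:bt:PO-991"},
    ],
}
```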
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as it is for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
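The fixed-point property exploited here is easy to state: a Davies-Meyer step computes H_i = E_M(H_{i-1}) XOR H_{i-1}, so choosing H = E_M^{-1}(0) makes the step output H unchanged for any chosen block M (Dean's observation). The sketch below demonstrates the mechanics with a deliberately toy, insecure "block cipher"; only the fixed-point construction, not the cipher, reflects the attack.

```python
MASK = (1 << 64) - 1
C = 0x9E3779B97F4A7C15

def toy_encrypt(key, block):
    """Toy invertible 64-bit 'block cipher' (NOT secure; illustration only)."""
    return ((block ^ key) + C) & MASK

def toy_decrypt(key, block):
    return ((block - C) & MASK) ^ key

def davies_meyer(h, m):
    """One Davies-Meyer compression step: H_i = E_M(H_{i-1}) XOR H_{i-1}."""
    return toy_encrypt(m, h) ^ h

def fixed_point(m):
    """Dean's fixed point: H = D_M(0) satisfies davies_meyer(H, M) == H,
    because E_M(H) = 0 and 0 XOR H = H, which underlies the
    expandable-message forgeries discussed above."""
    return toy_decrypt(m, 0)

m = 0xDEADBEEF
h = fixed_point(m)
assert davies_meyer(h, m) == h   # H is unchanged by compressing block M
```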
Abstract:
The Grøstl hash function: Grøstl is fast, provably secure, side-channel resistant, and simple.
Abstract:
This paper reports the details of an experimental study of cold-formed steel hollow section columns at ambient and elevated temperatures. In this study, the global buckling behaviour of cold-formed Square Hollow Section (SHS) slender columns under axial compression was investigated at various uniform elevated temperatures up to 700 °C. The results of these column tests are reported in this paper, including the buckling/failure modes at elevated temperatures and the ultimate load versus temperature curves. Finite element models of the tested columns were also developed, and their behaviour and ultimate capacities at ambient and elevated temperatures were studied. Fire design rules given in European and American standards, including design rules based on the Direct Strength Method (DSM), were used to predict the ultimate capacities of the tested columns at elevated temperatures. Elevated-temperature mechanical properties and stress-strain models given in European steel design standards and in past research were used with the design rules and finite element models to investigate their effects on SHS column capacities. Comparisons of column capacities from tests and finite element analyses with those predicted by current design rules were used to determine the accuracy of currently available column design rules in predicting the capacities of SHS columns at elevated temperatures, and the need to use appropriate elevated-temperature material stress-strain models. This paper presents the important findings derived from these comparisons.
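As a first-order illustration of how elevated-temperature capacities relate to ambient ones, the sketch below scales an ambient capacity by the interpolated yield-strength reduction factor k_y(theta) of EN 1993-1-2 Table 3.1. This is a simplification: a full check also reduces the elastic modulus and re-enters the buckling curve, as the paper's design-rule comparisons do, and slender cold-formed sections should use the lower 0.2% proof-strength factors. The numeric example is invented.

```python
import numpy as np

# Yield-strength reduction factors k_y(theta) from EN 1993-1-2 Table 3.1.
TEMPS = np.array([20, 100, 200, 300, 400, 500, 600, 700, 800])
K_Y   = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.78, 0.47, 0.23, 0.11])

def elevated_temp_capacity(N_ambient, theta_c):
    """Scale an ambient-temperature column capacity (any force unit) by
    the linearly interpolated yield-strength reduction factor at the
    uniform member temperature theta_c in degrees Celsius."""
    return N_ambient * np.interp(theta_c, TEMPS, K_Y)

print(elevated_temp_capacity(450.0, 650))   # 450 * 0.35 = 157.5 at 650 C
```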