997 results for Standardised testing


Relevance: 20.00%

Abstract:

The paper addresses the quality of the interface and edge bonded joints in layers of cross-laminated timber (CLT) panels. The shear performance was studied to assess the suitability of two different adhesives, polyurethane (PUR) and phenol-resorcinol-formaldehyde (PRF), and to determine the optimum clamping pressure. Since there is no established testing procedure to determine the shear strength of the surface bonds between layers in a CLT panel, block shear tests of specimens in two different configurations were carried out, and further shear tests of edge bonded specimens in two configurations were performed. Delamination tests were performed on samples subjected to accelerated aging to assess the durability of bonds under severe environmental conditions. Both tested adhesives produced boards with shear strength values within the edge bonding requirements of prEN 16351 for all manufacturing pressures. While the PUR specimens had higher shear strength values, the PRF specimens demonstrated superior durability in the delamination tests. The test protocol introduced in this study, in which cross-laminated bonded specimens cut from a CLT panel are placed horizontally in the shearing tool, appears to reflect the shear strength of glue lines in CLT accurately.
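
For readers implementing such block shear tests, the core calculation is simple: shear strength is the failure load divided by the sheared bond area, compared against a requirement threshold. The following is a minimal Python sketch; the specimen values and the requirement figure are placeholders for illustration, not data or limits from the paper or from prEN 16351.

```python
# Minimal sketch: glue-line shear strength from a block shear test.
# tau = F_max / A_bond, where F_max is the failure load and A_bond the
# sheared bond area. The requirement value below is a placeholder, not
# the actual prEN 16351 figure.

def shear_strength(failure_load_n: float, bond_area_mm2: float) -> float:
    """Shear strength in N/mm^2 (MPa) from failure load (N) and bond area (mm^2)."""
    return failure_load_n / bond_area_mm2

REQUIREMENT_MPA = 1.25  # illustrative threshold only

specimens = [
    # (label, failure load in N, bond area in mm^2) -- invented values
    ("PUR-0.1MPa", 4800.0, 2500.0),
    ("PRF-0.1MPa", 4300.0, 2500.0),
]

for label, load, area in specimens:
    tau = shear_strength(load, area)
    verdict = "pass" if tau >= REQUIREMENT_MPA else "fail"
    print(f"{label}: tau = {tau:.2f} MPa -> {verdict}")
```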

Relevance: 20.00%

Abstract:

This report summarizes our results from a security analysis covering all 57 first-round candidates in the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR) and over 210 implementations. We have manually identified security issues with three candidates, two of which are more serious, and these ciphers have been withdrawn from the competition. We have developed a testing framework, BRUTUS, to facilitate automatic detection of simple security lapses and susceptible statistical structures across all ciphers. From this testing, we have security usage notes on four submissions and statistical notes on a further four. We highlight that some of the CAESAR algorithms pose an elevated risk if employed in real-life protocols due to a class of adaptive chosen-plaintext attacks. Although authenticated encryption with associated data (AEAD) algorithms are often defined (and are best used) as discrete primitives that authenticate and transmit only complete messages, in practice these algorithms are easily implemented in a fashion that outputs observable ciphertext data before the algorithm has received all of the (attacker-controlled) plaintext. For an implementor, this strategy offers seemingly harmless and compliant storage and latency advantages. If the algorithm uses the same state for secret keying information, encryption, and integrity protection, and the internal mixing permutation is not cryptographically strong, an attacker can exploit the ciphertext–plaintext feedback loop to reveal secret state information or even keying material. We conclude that the main advantages of exhaustive, automated cryptanalysis are that it acts as a necessary sanity check for implementations and gives the cryptanalyst insights that can be used to focus more specific attack methods on given candidates.
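
To illustrate the kind of automated statistical check such a framework can run, the following is a minimal Python sketch of a monobit frequency test on ciphertext output. It is not code from BRUTUS, and os.urandom merely stands in for a real candidate cipher's output.

```python
# Hedged sketch of one automated statistical sanity check: a monobit
# frequency test. A cipher whose ciphertext bits are detectably biased
# exhibits a "susceptible statistical structure". Illustrative only;
# this is not code from the BRUTUS framework.
import math
import os

def monobit_z_score(data: bytes) -> float:
    """Z-score of the ones-count against the N/2 expectation for random bits."""
    nbits = len(data) * 8
    ones = sum(bin(b).count("1") for b in data)
    # For unbiased bits, ones ~ Binomial(nbits, 0.5); normal approximation:
    return (ones - nbits / 2) / math.sqrt(nbits / 4)

ciphertext = os.urandom(1 << 16)   # stand-in for real cipher output
z = monobit_z_score(ciphertext)
print(f"monobit z = {z:.2f}; |z| > 3 would flag a biased keystream")
```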

Relevance: 20.00%

Abstract:

The channel-based model of duration perception postulates the existence of neural mechanisms that respond selectively to a narrow range of stimulus durations centred on their preferred duration (Heron et al., Proceedings of the Royal Society B, 279, 690–698). In principle, the channel-based model could explain recent reports of adaptation-induced visual duration compression effects (Johnston et al., Current Biology, 16, 472–479; Curran and Benton, Cognition, 122, 252–257); from this perspective, duration compression is a consequence of the adapting stimuli being presented for a longer duration than the test stimuli. In the current experiment, observers adapted to a sequence of moving random dot patterns at the same retinal position, each 340 ms in duration and separated by a variable (500–1000 ms) interval. Following adaptation, observers judged the duration of a 600 ms test stimulus at the same location. The test stimulus moved in the same, or opposite, direction as the adaptor. Contrary to the channel-based model's prediction, test stimulus duration appeared compressed, rather than expanded, when it moved in the same direction as the adaptor. That test stimulus duration was not distorted when moving in the opposite direction further suggests that visual timing mechanisms are influenced by additional neural processing associated with the stimulus being timed.
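
For readers unfamiliar with how such duration judgements are quantified, a common analysis estimates the point of subjective equality (PSE): the comparison duration judged longer than the test on 50% of trials. The sketch below uses invented response proportions purely for illustration; the abstract reports no raw data, and linear interpolation stands in for a full psychometric-function fit.

```python
# Hedged sketch: estimating a PSE from the proportion of trials on which
# a comparison stimulus was judged longer than the adapted 600 ms test.
# A PSE below 600 ms indicates the test appeared compressed. All numbers
# below are invented for illustration.
import numpy as np

comparison_ms = np.array([400, 467, 533, 600, 667, 733, 800])
p_comp_longer = np.array([0.05, 0.12, 0.30, 0.62, 0.81, 0.93, 0.97])

# PSE = comparison duration judged longer on 50% of trials.
pse = np.interp(0.5, p_comp_longer, comparison_ms)
print(f"PSE = {pse:.0f} ms; PSE < 600 ms indicates apparent compression")
```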

Relevance: 20.00%

Abstract:

The use of biological tissues in the in vitro assessment of dissolving microneedle (MN) array mechanical strength and subsequent drug release profiles presents some fundamental difficulties, in part due to the inherent variability of the biological tissues employed. As a result, these biological materials are not appropriate for routine use in industrial formulation development or quality control (QC) tests. In the present work, a facile system using Parafilm M® (PF) to test the drug release performance of dissolving MN arrays is proposed. Dissolving MN arrays containing 196 needles (600 μm needle height) were inserted into a single layer of PF, and a hermetic “pouch” enclosing the array was created. The resulting system was placed in a dissolution bath and the release of model molecules was evaluated. Different MN formulations were tested using this novel setup, releasing between 40 and 180 µg of their cargo after 6 hours. The proposed system is a more realistic approach for MN testing than the typical performance test described in the literature for conventional transdermal patches. Additionally, the use of the PF membrane was tested both in the hermetic “pouch” configuration and using Franz cell methodology, yielding comparable release curves. Microscopy was used to confirm the insertion of the different MN arrays into the PF layer. The proposed system appears to be a good alternative to the use of Franz cells for comparing different MN formulations. Given the increasing industrial interest in MN technology, the proposed system has potential as a standardised drug/active agent release test for quality control purposes.
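
Release curves from a dissolution test of this kind are often summarised by fitting a simple kinetic model. As a hedged illustration, the sketch below fits a first-order release model, Q(t) = Q_inf(1 - exp(-kt)), to invented data in the reported range (tens of micrograms over 6 hours); neither the data points nor the model choice come from the paper.

```python
# Hedged sketch: summarising a cumulative release curve with a
# first-order model. Data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t_h = np.array([0.5, 1, 2, 3, 4, 5, 6])            # sampling times (hours)
q_ug = np.array([22, 41, 70, 92, 108, 118, 124])   # cumulative release (ug)

def first_order(t, q_inf, k):
    """Q(t) = Q_inf * (1 - exp(-k t))."""
    return q_inf * (1.0 - np.exp(-k * t))

(q_inf, k), _ = curve_fit(first_order, t_h, q_ug, p0=(130.0, 0.5))
print(f"Q_inf = {q_inf:.0f} ug, k = {k:.2f} 1/h")
```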

Relevance: 20.00%

Abstract:

Background: Clostridium difficile (C. difficile) is a leading cause of infectious diarrhoea in hospitals. Sending faecal samples for testing expedites diagnosis and appropriate treatment. Clinical suspicion of C. difficile based on patient history, signs and symptoms is the basis for sampling. Sending faecal samples from patients with diarrhoea ‘just in case’ the patient has C. difficile may be an indication of poor clinical management.

Aim: To evaluate the effectiveness of an intervention by an Infection Prevention and Control Team (IPCT) in reducing inappropriate faecal samples sent for C. difficile testing.

Method: An audit of the number of faecal samples sent before and after a decision-making algorithm was introduced. The number of samples received in the laboratory was counted retrospectively for 12-week periods before and after the algorithm was introduced.

Findings: There was a statistically significant reduction in the mean number of faecal samples sent after the algorithm was introduced. Results were compared with a similar intervention carried out in 2009, in which the same message was delivered by a memorandum. In 2009 the memorandum had no effect on the overall number of weekly samples being sent.

Conclusion: An algorithm intervention had an effect on the number of faecal samples being sent for C. difficile testing and thus contributed to the effective use of the laboratory service.
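
The before/after comparison described can be illustrated with a two-sample t-test on weekly sample counts. In the sketch below, only the design (two 12-week periods) comes from the paper; the weekly counts themselves are invented.

```python
# Hedged sketch of the audit's before/after comparison: weekly counts of
# faecal samples over two 12-week periods, compared with a two-sample
# t-test. Counts are invented for illustration.
from scipy import stats

before = [34, 29, 31, 38, 27, 33, 30, 35, 32, 36, 28, 31]  # weekly counts
after  = [24, 21, 26, 19, 23, 25, 20, 22, 24, 18, 21, 23]

t, p = stats.ttest_ind(before, after)
print(f"mean before = {sum(before)/12:.1f}, after = {sum(after)/12:.1f}")
print(f"t = {t:.2f}, p = {p:.4f}")
```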

Relevance: 20.00%

Abstract:

Molecular testing is becoming an important part of the diagnosis of any patient with cancer. The challenge to laboratories is to meet this need, using reliable methods and processes to ensure that patients receive a timely and accurate report on which their treatment will be based. The aim of this paper is to provide minimum requirements for the management of molecular pathology laboratories. This general guidance should be augmented by the specific guidance available for different tumour types and tests. Preanalytical considerations are important, and careful consideration of the way in which specimens are obtained and reach the laboratory is necessary. Sample receipt and handling follow standard operating procedures, but some alterations may be necessary if molecular testing is to be performed, for instance to control tissue fixation. DNA and RNA extraction can be standardised and should be checked for quality and quantity of output on a regular basis. The choice of analytical method(s) depends on clinical requirements, desired turnaround time, and the expertise available. Internal quality control, regular internal audit of the whole testing process, laboratory accreditation, and continual participation in external quality assessment schemes are prerequisites for the delivery of a reliable service. A molecular pathology report should accurately convey the information the clinician needs to treat the patient, with sufficient detail to allow correct interpretation of the result. Molecular pathology is developing rapidly, and further detailed evidence-based recommendations are required for many of the topics covered here.
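
The recommendation that extracted DNA and RNA be checked for quality and quantity on a regular basis can be operationalised as a simple QC gate. The sketch below uses common rule-of-thumb spectrophotometric thresholds (minimum concentration, A260/A280 purity ratio) that are assumptions for illustration, not values from the guidance.

```python
# Hedged sketch: a quantity/quality gate for extracted nucleic acid.
# Thresholds are rule-of-thumb assumptions, not figures from the paper.
from dataclasses import dataclass

@dataclass
class Extraction:
    sample_id: str
    conc_ng_per_ul: float   # concentration (ng/uL)
    a260_a280: float        # spectrophotometric purity ratio

def passes_qc(x: Extraction,
              min_conc: float = 10.0,
              ratio_range: tuple = (1.7, 2.1)) -> bool:
    """True if concentration and purity ratio fall within the assumed limits."""
    lo, hi = ratio_range
    return x.conc_ng_per_ul >= min_conc and lo <= x.a260_a280 <= hi

for x in [Extraction("S1", 42.0, 1.84), Extraction("S2", 6.5, 1.52)]:
    print(x.sample_id, "PASS" if passes_qc(x) else "FAIL - repeat extraction")
```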

Relevance: 20.00%

Abstract:

Seafloor massive sulfide (SMS) mining will likely occur at hydrothermal systems in the near future. Alongside their mineral wealth, SMS deposits also have considerable biological value. Active SMS deposits host endemic hydrothermal vent communities, whilst inactive deposits support communities of deep water corals and other suspension feeders. Mining activities are expected to remove all large organisms and suitable habitat in the immediate area, placing vent endemic organisms particularly at risk of habitat loss and localised extinction. As part of environmental management strategies designed to mitigate the effects of mining, areas of seabed need to be protected to preserve the biodiversity lost at the mine site and to preserve communities that support connectivity among populations of vent animals in the surrounding region. These "set-aside" areas need to be biologically similar to the mine site and suitably connected, mostly by the transport of larvae, to neighbouring sites to ensure the exchange of genetic material among the remaining populations. Establishing suitable set-asides can be a formidable task for environmental managers; however, the application of genetic approaches can aid set-aside identification, suitability assessment, and monitoring. There are many genetic tools available, including analysis of mitochondrial DNA (mtDNA) sequences (e.g. COI or other suitable mtDNA genes), appropriate nuclear DNA markers (e.g. microsatellites, single nucleotide polymorphisms), environmental DNA (eDNA) techniques, and microbial metagenomics. When used in concert with traditional biological survey techniques, these tools can help to identify species, assess the genetic connectivity among populations, and assess the diversity of communities. How these techniques can be applied to set-aside decision making is discussed, and recommendations are made for the genetic characteristics of set-aside sites. A checklist for environmental regulators provides a guide to aid decision making on the suitability of set-aside design and assessment using genetic tools. This non-technical primer document represents the views of participants in the VentBase 2014 workshop.
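
Of the genetic measures mentioned, one of the simplest to compute from aligned mtDNA (e.g. COI) sequences is nucleotide diversity, pi: the mean proportion of sites differing between pairs of sequences. The sketch below uses short made-up fragments purely to show the calculation.

```python
# Hedged sketch: nucleotide diversity (pi) from aligned sequences,
# computed as the mean pairwise proportion of differing sites.
# Sequences are invented fragments for illustration.
from itertools import combinations

seqs = [
    "ATGCGTACGTTAGC",
    "ATGCGTACGTTAGC",
    "ATGCATACGTTAGC",
    "ATGCGTACGATAGC",
]

def pairwise_diff(a: str, b: str) -> float:
    """Proportion of aligned sites that differ (sequences assumed aligned)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

pairs = list(combinations(seqs, 2))
pi = sum(pairwise_diff(a, b) for a, b in pairs) / len(pairs)
print(f"nucleotide diversity pi = {pi:.4f}")
```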

Relevance: 20.00%

Abstract:

Simulation is a well-established and effective approach to the development of fuel-efficient and low-emissions vehicles in both on-highway and off-highway applications.

The simulation of on-highway automotive vehicles is widely reported in the literature, whereas research relating to non-automotive and off-highway vehicles is relatively sparse. This review paper focuses on the challenges of simulating such vehicles and discusses the differences in the approach to drive-cycle testing and experimental validation of vehicle simulations. In particular, an inner-city diesel-electric hybrid bus and an ICE (Internal Combustion Engine) powered forklift truck are used as case studies.

Computer prediction of fuel consumption and emissions of automotive vehicles on standardised drive cycles is well-established and commercial software packages such as AVL CRUISE have been specifically developed for this purpose. The vehicles considered in this review paper present new challenges from both the simulation and drive-cycle testing perspectives. For example, in the case of the forklift truck, the drive cycles involve reversing elements, variable mass, lifting operations, and do not specify a precise velocity-time profile. In particular, the difficulties associated with the prediction of productivity, i.e. the maximum rate of completing a series of defined operations, are discussed. In the case of the hybrid bus, the standardised drive cycles are unrepresentative of real-life use and alternative approaches are required in the development of efficient and low-emission vehicles.

Two simulation approaches are reviewed: the adaptation of a standard automotive vehicle simulation package, and the development of bespoke models using packages such as MATLAB/Simulink.
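
As a flavour of what a bespoke model, built in MATLAB/Simulink or sketched here in Python, actually computes, the example below runs a backward-facing road-load calculation over a simple velocity-time profile and integrates tractive power to cycle energy. All vehicle parameters are illustrative and are not taken from either case study.

```python
# Hedged sketch of a backward-facing drive-cycle calculation: tractive
# force from a road-load equation, integrated to cycle energy.
# Parameters are bus-like placeholder values, not case-study data.
import numpy as np

dt = 1.0                                      # time step (s)
v = np.concatenate([np.linspace(0, 10, 20),   # accelerate to 10 m/s
                    np.full(40, 10.0),        # cruise
                    np.linspace(10, 0, 20)])  # brake to rest

m, cd_a, crr, rho, g = 12000.0, 6.5, 0.008, 1.2, 9.81  # placeholder values
a = np.gradient(v, dt)
# road load: inertia + aerodynamic drag + rolling resistance (when moving)
f_trac = m * a + 0.5 * rho * cd_a * v**2 + crr * m * g * (v > 0)
p_trac = np.maximum(f_trac * v, 0.0)          # ignore regeneration here
energy_mj = p_trac.sum() * dt / 1e6
print(f"tractive energy over cycle: {energy_mj:.2f} MJ")
```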

Relevance: 20.00%

Abstract:

This paper demonstrates the unparalleled value of full-scale data acquired from ocean trials of Aquamarine Power’s Oyster 800 Wave Energy Converter (WEC) at the European Marine Energy Centre (EMEC), Orkney, Scotland.
High-quality prototype and wave data were simultaneously recorded in over 750 distinct sea states (comprising different wave height, wave period, and tidal height combinations) and include periods of operation where the hydraulic Power Take-Off (PTO) system was both pressurised (damped operation) and de-pressurised (undamped operation).
A detailed model-prototype correlation procedure is presented in which the full-scale prototype behaviour is compared to predictions from both experimental and numerical modelling techniques via a high-temporal-resolution wave-by-wave reconstruction. This provides a definitive verification of the capabilities of such research techniques and enables a robust and meaningful uncertainty analysis to be performed on their outputs.
The importance of a good data capture methodology, in terms of both handling and accuracy, is also presented. The techniques and procedures implemented by Aquamarine Power for real-time data management are discussed, including lessons learned on the instrumentation and infrastructure required to collect high-value data.
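
A wave-by-wave model-prototype correlation of the kind described ultimately reduces to comparing two aligned time series. The sketch below computes bias, RMSE, and the correlation coefficient on synthetic stand-in signals; it illustrates the metrics only and uses none of the Oyster 800 data.

```python
# Hedged sketch: basic model-prototype agreement metrics on two aligned
# time series. The signals are synthetic stand-ins for measured and
# predicted data, invented for illustration.
import numpy as np

t = np.arange(0, 600, 0.5)                        # 10 min sampled at 2 Hz
measured = np.sin(2 * np.pi * t / 9.0)            # stand-in prototype signal
predicted = 0.95 * np.sin(2 * np.pi * t / 9.0 + 0.05)  # stand-in model output

bias = np.mean(predicted - measured)
rmse = np.sqrt(np.mean((predicted - measured) ** 2))
r = np.corrcoef(measured, predicted)[0, 1]
print(f"bias = {bias:+.3f}, RMSE = {rmse:.3f}, r = {r:.3f}")
```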