80 results for Analyst
Abstract:
The potential of IR absorption and Raman spectroscopy for rapid identification of novel psychoactive substances (NPS) has been tested using a set of 221 unsorted seized samples suspected of containing NPS. Both IR and Raman spectra showed large variation between the different sub-classifications of NPS and smaller, but still distinguishable, differences between closely related compounds within the same class. In initial tests, screening the samples using spectral searching against a limited reference library allowed only 41% of the samples to be fully identified. The limiting factor in the identification was the large number of active compounds in the seized samples for which no reference vibrational data were available in the libraries, rather than poor spectral quality. Therefore, when 33 of these compounds were independently identified by NMR and mass spectrometry and their spectra were used to extend the libraries, the percentage of samples identified by IR and Raman screening alone increased to 76%, with only 7% of samples having no identifiable constituents. This study, which is the largest of its type ever carried out, therefore demonstrates that this approach of detecting non-matching samples and then identifying them using standard analytical methods has considerable potential in NPS screening, since it allows rapid identification of the constituents of the majority of street-quality samples. Only one complete feedback cycle was carried out in this study, but there is clearly the potential to carry out continuous identification and updating when this system is used in operational settings.
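The screening-and-feedback loop described above can be sketched in a few lines: search each spectrum against the library, flag non-matches for independent NMR/MS identification, and fold the confirmed spectra back into the library. This is a minimal illustration with made-up spectra and a hypothetical cosine-similarity hit-quality score, not the search algorithm actually used in the study:

```python
import numpy as np

def hit_quality_index(spectrum, reference):
    """Cosine similarity between two equally sampled spectra (1.0 = identical)."""
    s = spectrum / np.linalg.norm(spectrum)
    r = reference / np.linalg.norm(reference)
    return float(np.dot(s, r))

def screen(sample, library, threshold=0.95):
    """Return the name of the best library match, or None if unidentified."""
    name, ref = max(library.items(),
                    key=lambda kv: hit_quality_index(sample, kv[1]))
    return name if hit_quality_index(sample, ref) >= threshold else None

# Feedback cycle: an unidentified sample goes to NMR/MS, and its confirmed
# spectrum is added to the library, raising the identification rate.
library = {"mephedrone": np.array([0.1, 0.9, 0.2, 0.05])}
unknown = np.array([0.8, 0.1, 0.7, 0.3])          # no good match in library
if screen(unknown, library) is None:
    library["new NPS (NMR/MS confirmed)"] = unknown  # library extension
assert screen(unknown, library) == "new NPS (NMR/MS confirmed)"
```

The key design point mirrored here is that non-matches are not discarded: they are exactly the samples that drive the library extension and the jump in identification rate.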
Abstract:
This study investigates topology optimization of energy-absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that is able to detect when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. The continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy-absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. It identifies when the optimization has plateaued and is unlikely to yield improved designs in further iterations. This gives the designer a rational way to decide how long to run the optimization, avoiding computational resources wasted on unnecessary iterations. A case study is presented to demonstrate the use of this method.
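The continuity constraint check can be viewed as a connectivity test on the finite element model: if fracture removes enough failed elements that no chain of intact elements links the loaded nodes to the supports, the design has lost its load path. A minimal sketch, assuming a simple node-sharing adjacency and hypothetical element data (not the authors' implementation):

```python
from collections import deque

def load_path_exists(intact_elements, loaded_nodes, support_nodes):
    """BFS over intact elements: True if any load path connects a
    loaded node to a support node through shared element nodes."""
    adjacency = {}
    for nodes in intact_elements:       # each intact element joins its nodes
        for a in nodes:
            for b in nodes:
                if a != b:
                    adjacency.setdefault(a, set()).add(b)
    frontier, seen = deque(loaded_nodes), set(loaded_nodes)
    while frontier:
        n = frontier.popleft()
        if n in support_nodes:
            return True                 # feasible load path found
        for m in adjacency.get(n, ()):
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return False                        # continuity constraint violated

# Two-element strut: nodes 0-1-2, loaded at node 0, supported at node 2.
assert load_path_exists([(0, 1), (1, 2)], {0}, {2})      # continuous
assert not load_path_exists([(0, 1)], {0}, {2})          # element (1,2) failed
```

In an optimization loop, a `False` result would mark the candidate design as infeasible without any intervention from the analyst, which is the automated behaviour the abstract describes.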
Abstract:
Virtual topology operations have been utilized to generate an analysis topology definition suitable for downstream mesh generation. Detailed descriptions are provided for virtual topology merge and split operations for all topological entities. Current virtual topology technology is extended to allow the virtual partitioning of volume cells and the topological queries required to carry out each operation are provided. Virtual representations are robustly linked to the underlying geometric definition through an analysis topology. The analysis topology and all associated virtual and topological dependencies are automatically updated after each virtual operation, providing the link to the underlying CAD geometry. Therefore, a valid description of the analysis topology, including relative orientations, is maintained. This enables downstream operations, such as the merging or partitioning of virtual entities, and interrogations, such as determining if a specific meshing strategy can be applied to the virtual volume cells, to be performed on the analysis topology description. As the virtual representation is a non-manifold description of the sub-divided domain the interfaces between cells are recorded automatically. This enables the advantages of non-manifold modelling to be exploited within the manifold modelling environment of a major commercial CAD system, without any adaptation of the underlying CAD model. A hierarchical virtual structure is maintained where virtual entities are merged or partitioned. This has a major benefit over existing solutions as the virtual dependencies are stored in an open and accessible manner, providing the analyst with the freedom to create, modify and edit the analysis topology in any preferred sequence, whilst the original CAD geometry is not disturbed. 
Robust definitions of the topological and virtual dependencies enable the same virtual topology definitions to be accessed, interrogated and manipulated within multiple different CAD packages and linked to the underlying geometry.
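To illustrate the kind of bookkeeping such a hierarchical virtual structure implies, the sketch below records each virtual merge as a dependency on underlying entities, so the CAD faces themselves are never modified and any virtual entity can be resolved back to its CAD geometry. This is a deliberately simplified, hypothetical data model, not the non-manifold analysis topology described above:

```python
class AnalysisTopology:
    """Toy virtual-topology layer: virtual entities reference, but never
    modify, the underlying CAD entities."""

    def __init__(self, cad_faces):
        self.cad_faces = set(cad_faces)   # untouched CAD definition
        self.virtual = {}                 # virtual id -> constituent entities
        self._next = 0

    def merge_faces(self, faces):
        """Virtual merge: group faces under one virtual super-face;
        the dependency record stays open and queryable."""
        faces = frozenset(faces)
        assert faces <= self.cad_faces | self.virtual.keys()
        vid = f"vface{self._next}"
        self._next += 1
        self.virtual[vid] = faces         # hierarchical dependency record
        return vid

    def dependencies(self, entity):
        """Resolve a virtual entity down to its underlying CAD faces."""
        resolved = set()
        for f in self.virtual.get(entity, {entity}):
            resolved |= self.dependencies(f) if f in self.virtual else {f}
        return resolved

topo = AnalysisTopology(["f1", "f2", "f3"])
v0 = topo.merge_faces(["f1", "f2"])       # merge two CAD faces
v1 = topo.merge_faces([v0, "f3"])         # hierarchical: virtual + CAD face
assert topo.dependencies(v1) == {"f1", "f2", "f3"}
assert topo.cad_faces == {"f1", "f2", "f3"}   # CAD geometry undisturbed
```

Because dependencies are stored explicitly rather than baked into the geometry, merges can be created, interrogated, and undone in any order, which is the freedom the abstract attributes to the open dependency storage.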
Abstract:
Highly swellable polymer films doped with Ag nanoparticle aggregates (poly-SERS films) have been used to record very high signal-to-noise ratio, reproducible surface-enhanced (resonance) Raman (SER(R)S) spectra of in situ dried ink lines and their constituent dyes using both 633 and 785 nm excitation. These allowed the chemical origins of differences in the SERRS spectra of different inks to be determined. Initial investigation of pure samples of the 10 most common blue dyes showed that dyes with very similar chemical structures, such as Patent Blue V and Patent Blue VF (which differ only by a single OH group), gave SERRS spectra in which the only indications that the dye structure had changed were small differences in peak positions or relative intensities of the bands. SERRS studies of 13 gel pen inks were consistent with this observation. In some cases inks from different types of pens could be distinguished even though they were dominated by a single dye such as Victoria Blue B (Zebra Surari) or Victoria Blue BO (Pilot Acroball) because their predominant dye did not appear in other inks. Conversely, identical spectra were also recorded from different types of pens (Pilot G7, Zebra Z-grip) because they all had the same dominant Brilliant Blue G dye. Finally, some of the inks contained mixtures of dyes which could be separated by TLC and removed from the plate before being analysed with the same poly-SERS films. For example, the Pentel EnerGel ink pen was found to give TLC spots corresponding to Erioglaucine and Brilliant Blue G. Overall, this study has shown that the spectral differences between different inks which are based on chemically similar, but nonetheless distinct, dyes are extremely small, so very close matches between SERRS spectra are required for confident identification. Poly-SERS substrates can routinely provide the very stringent reproducibility and sensitivity levels required.
This, coupled with an awareness of the reasons underlying the observed differences between similarly coloured inks, allows a more confident assessment of the evidential value of ink SERS and should underpin the adoption of this approach as a routine method for the forensic examination of inks.
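The stringency requirement can be illustrated with a toy peak-position comparison: dyes that differ by a single functional group shift only a few bands by a few wavenumbers, so the matching tolerance must be tight or close analogues become indistinguishable. The band positions below are illustrative values, not measured Patent Blue data:

```python
def peaks_match(sample_peaks, reference_peaks, tol_cm1=3.0):
    """True if every reference band has a sample band within tol_cm1 cm^-1."""
    return all(min(abs(p - r) for p in sample_peaks) <= tol_cm1
               for r in reference_peaks)

# Illustrative band positions (cm^-1) for two hypothetical close analogues:
dye_a = [915, 1172, 1340, 1582, 1618]
dye_b = [915, 1178, 1348, 1582, 1618]   # two bands shifted by 6-8 cm^-1

assert peaks_match(dye_a, dye_a)                 # identical dye: match
assert not peaks_match(dye_b, dye_a)             # tight tolerance separates them
assert peaks_match(dye_b, dye_a, tol_cm1=10.0)   # loose tolerance confuses them
```

The last assertion shows why "very close matches" matter: relaxing the tolerance by only a few wavenumbers collapses two chemically distinct dyes into a false positive.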
Abstract:
Key Performance Indicators (KPIs) and their predictions are widely used by enterprises for informed decision making. Nevertheless, a very important factor, which is generally overlooked, is that top-level strategic KPIs are actually driven by operational-level business processes. These two domains are, however, mostly segregated and analysed in silos with different Business Intelligence solutions. In this paper, we propose an approach for advanced Business Simulations which converges the two domains by utilising process execution and business data, together with concepts from Business Dynamics (BD) and Business Ontologies, to promote better system understanding and detailed KPI predictions. Our approach incorporates the automated creation of Causal Loop Diagrams, thus empowering the analyst to critically examine the complex dependencies hidden in the massive amounts of available enterprise data. We have further evaluated our proposed approach in the context of a retail use case that involved verification of the automatically generated causal models by a domain expert.