922 results for Measurement Techniques, Fibreoptic
Abstract:
Acoustic emission (AE) analysis is one of several diagnostic techniques currently available for structural health monitoring (SHM) of engineering structures. Its advantages over other techniques include high sensitivity to crack growth and the capability of monitoring a structure in real time. Acoustic emission is the phenomenon of rapid release of energy within a material, in the form of stress waves, caused by crack initiation or growth. In the AE technique, these stress waves are recorded by suitable sensors placed on the surface of a structure, and the recorded signals are subsequently analysed to gather information about the nature of the source. By enabling early detection of crack growth, the AE technique helps in planning timely retrofitting, other maintenance work, or even replacement of the structure if required. Although it is a promising tool, several challenges still stand in the way of the successful application of the AE technique. A large amount of data is generated during AE testing, so effective data analysis is necessary, especially for long-term monitoring. Appropriate analysis of AE data for quantification of damage level is an area that has received considerable attention. Various approaches to damage quantification for severity assessment are discussed in this paper, with special focus on civil infrastructure such as bridges. One method, improved b-value analysis, is used to analyse data collected from laboratory testing.
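The improved b-value analysis mentioned above is typically computed from the amplitude distribution of recorded AE hits over a sliding window. As a rough illustration only (not the paper's implementation), here is a minimal Python sketch assuming the commonly cited Shiotani-style formulation; the α constants, window size, and synthetic amplitudes are illustrative assumptions.

```python
import numpy as np

def improved_b_value(amplitudes_db, alpha1=1.0, alpha2=1.0):
    """Improved b-value (Ib) of a set of AE hit amplitudes (in dB).

    One common formulation compares the cumulative amplitude distribution
    at mu - alpha1*sigma and mu + alpha2*sigma; the alpha constants here
    are illustrative choices, not the paper's settings.
    """
    a = np.asarray(amplitudes_db, dtype=float)
    mu, sigma = a.mean(), a.std()
    lo, hi = mu - alpha1 * sigma, mu + alpha2 * sigma
    n_lo = np.count_nonzero(a >= lo)   # N(x): hits with amplitude >= x
    n_hi = np.count_nonzero(a >= hi)
    if n_lo == 0 or n_hi == 0 or sigma == 0:
        return np.nan
    return (np.log10(n_lo) - np.log10(n_hi)) / ((alpha1 + alpha2) * sigma)

# Example: Ib values over successive windows of a synthetic AE hit stream.
rng = np.random.default_rng(0)
hits = 40 + 15 * rng.random(500)          # synthetic amplitudes in dB
window = 100
ib_series = [improved_b_value(hits[i:i + window])
             for i in range(0, len(hits) - window, window)]
print(ib_series)
```

A falling Ib-value across successive windows is conventionally read as a shift toward higher-amplitude events, i.e. progressing macro-cracking.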
Abstract:
In the medical and healthcare arena, patients' data is not just their own personal history but also a valuable large dataset for finding solutions for diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and major stakeholders such as physicians and their patients, the accessibility of such information should be handled in a way that preserves privacy and security. Thus, finding the best way to keep the data secure has become an important issue in the area of database security. Sensitive medical data should be encrypted in databases. There are many encryption/decryption techniques and algorithms for preserving privacy and security, and their performance is an important factor when medical data is managed in databases. Another important factor is that stakeholders should decide on more cost-effective ways to reduce the total cost of ownership. As an alternative, DAS (Data as a Service) is a popular outsourcing model that satisfies cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that incorporates the Microsoft built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas, and two types of simulations are carried out. The first stage uses 6 databases to measure the performance of plain text, Microsoft built-in encryption, and AES-DAS. In particular, AES-DAS incorporates implementations of symmetric key encryption, namely AES (Advanced Encryption Standard), and a bucket indexing processor using a Bloom filter. The results are categorised by character type, numeric type, range queries, range queries using the bucket index, and aggregate queries. The second stage is a scalability test from 5K to 2560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS using the bucket index is around 3.32 times faster than a normal AES-DAS with 70 partitions and 10K-record databases. Retrieving numeric-typed data takes less time than character-typed data in AES-DAS. The aggregation query response time in AES-DAS is not as consistent as that in the MS built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, the query response time becomes rapidly slower. However, there is more to investigate in order to obtain other outcomes and to construct a secure EMR (Electronic Medical Record) more efficiently from these simulations.
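The general idea behind a bucket index over encrypted data is that rows are stored AES-encrypted, while a coarse bucket id plus a per-bucket Bloom filter let the server prune candidate rows before the client decrypts and applies the exact predicate. The Python sketch below is a minimal illustration of that idea, not the project's AES-DAS implementation; it assumes the third-party `cryptography` package, and the partitioning function, Bloom parameters, and value range are hypothetical.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=128)
N_BUCKETS = 70                 # number of value partitions (as in the abstract)
BLOOM_BITS, BLOOM_HASHES = 1024, 3

def bucket_of(value: int) -> int:
    """Map a plaintext value to a coarse bucket id (illustrative range partitioning)."""
    return value * N_BUCKETS // 100_000          # assumes values in [0, 100000)

def bloom_positions(value: int) -> list[int]:
    return [int.from_bytes(hashlib.sha256(f"{i}:{value}".encode()).digest(), "big")
            % BLOOM_BITS for i in range(BLOOM_HASHES)]

def encrypt(value: int, aes: AESGCM) -> bytes:
    nonce = hashlib.sha256(str(value).encode()).digest()[:12]   # demo only; use random nonces in practice
    return nonce + aes.encrypt(nonce, str(value).encode(), None)

# "Server-side" table: (ciphertext, bucket id), plus one Bloom filter per bucket.
aes = AESGCM(KEY)
table, blooms = [], {b: [0] * BLOOM_BITS for b in range(N_BUCKETS)}
for v in [1200, 53000, 53017, 99999]:
    b = bucket_of(v)
    table.append((encrypt(v, aes), b))
    for p in bloom_positions(v):
        blooms[b][p] = 1

# Query value == 53017: the server returns rows from the matching bucket only if
# the bucket's Bloom filter may contain the value; the client decrypts and filters exactly.
q = 53017
candidates = [ct for ct, b in table
              if b == bucket_of(q) and all(blooms[b][p] for p in bloom_positions(q))]
print(len(candidates), "candidate row(s) shipped to the client for decryption")
```

The trade-off sketched here (fewer false-positive rows per bucket versus more metadata) is the kind of effect the 70-partition result in the abstract is measuring.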
Abstract:
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
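Critical vessel interactions of this kind are often characterised with proximity indicators such as the distance and time to the closest point of approach (DCPA/TCPA). The Python sketch below shows the generic geometry only, not the paper's specific risk measure; the thresholds and vessel states in the example are assumptions.

```python
import numpy as np

def cpa_indicators(p_own, v_own, p_other, v_other):
    """DCPA and TCPA for two vessels with positions p_* (nm) and velocities
    v_* (kn), assuming straight-line motion over the look-ahead horizon."""
    r = np.asarray(p_other, float) - np.asarray(p_own, float)   # relative position
    v = np.asarray(v_other, float) - np.asarray(v_own, float)   # relative velocity
    vv = float(np.dot(v, v))
    if vv == 0.0:                        # identical velocities: range never changes
        return float(np.linalg.norm(r)), 0.0
    tcpa = -float(np.dot(r, v)) / vv     # hours
    dcpa = float(np.linalg.norm(r + v * tcpa))
    return dcpa, tcpa

# Example: flag an interaction as a conflict if the vessels will pass within
# 0.5 nm in the next 6 minutes (illustrative thresholds).
dcpa, tcpa = cpa_indicators(p_own=(0, 0), v_own=(10, 0),
                            p_other=(2, 1), v_other=(0, -8))
print(dcpa, tcpa, dcpa < 0.5 and 0 <= tcpa < 0.1)
```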
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to manage enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we have designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms the existence of an impact of five distinct and individually important dimensions on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument that demonstrates strong reliability and validity.
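The reliability testing referred to above is commonly accompanied by an internal-consistency check such as Cronbach's alpha over the survey items behind each dimension. The sketch below is only an adjacent illustration on synthetic survey data (the paper's analysis is a confirmatory factor analysis); the scale name, item count, and sample size are assumptions.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = x.sum(axis=1).var(ddof=1)       # variance of the summed scale
    return k / (k - 1) * (1 - item_var / total_var)

# Example: internal consistency of a hypothetical "EA service quality" scale.
rng = np.random.default_rng(1)
latent = rng.normal(size=(120, 1))                   # 120 synthetic respondents
items = latent + 0.6 * rng.normal(size=(120, 4))     # 4 correlated items
print(round(cronbach_alpha(items), 2))               # well above the usual 0.7 cut-off here
```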
Abstract:
Qualitative and quantitative measurements of biomass components dissolved in the phosphonium ionic liquids (ILs) trihexyltetradecylphosphonium chloride ([P66614]Cl) and tributylmethylphosphonium methylsulphate ([P4441]MeSO4) are obtained using attenuated total reflectance-FTIR. Absorption bands related to cellulose, hemicelluloses, and lignin dissolution, monitored in situ in biomass-IL mixtures, indicate lignin dissolution in both ILs and some holocellulose dissolution in the hydrophilic [P4441]MeSO4. The kinetics of lignin dissolution reported here indicate that while dissolution in the hydrophobic IL [P66614]Cl appears to follow an accepted mechanism of acid-catalyzed β-aryl ether cleavage, dissolution in the hydrophilic IL [P4441]MeSO4 does not appear to follow this mechanism and may not be followed by condensation reactions (initiated by reactive ketones). The measurement of lignin dissolution in phosphonium ILs based on absorbance at 1510 cm⁻¹ has demonstrated utility. When coupled with the gravimetric Klason lignin method, ATR-FTIR study of reaction mixtures can lead to a better understanding of the delignification process. © 2012 Copyright Taylor and Francis Group, LLC.
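Tracking dissolution from the 1510 cm⁻¹ aromatic skeletal band usually amounts to relating band absorbance to dissolved lignin concentration (Beer-Lambert) and fitting a simple rate law to the time series. The sketch below (NumPy/SciPy) is generic and hedged: the data are synthetic, the calibration constant is hypothetical, and the pseudo-first-order form is an assumed rate law, not the paper's fitted model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic time series of the 1510 cm^-1 band absorbance during dissolution.
t = np.array([0, 10, 20, 40, 60, 90, 120], dtype=float)        # minutes
absorbance = np.array([0.00, 0.08, 0.14, 0.22, 0.27, 0.31, 0.33])

# Beer-Lambert: A = eps*b*c, so c is proportional to A at fixed path length.
K_CAL = 0.02                      # hypothetical absorbance per g/L dissolved lignin
conc = absorbance / K_CAL

# Assumed pseudo-first-order approach to an equilibrium concentration c_inf:
# c(t) = c_inf * (1 - exp(-k t)); c_inf and k are fitted.
def model(t, c_inf, k):
    return c_inf * (1.0 - np.exp(-k * t))

(c_inf, k), _ = curve_fit(model, t, conc, p0=(conc.max(), 0.02))
print(f"fitted c_inf = {c_inf:.1f} g/L, rate constant k = {k:.3f} 1/min")
```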
Abstract:
Background: The accurate evaluation of physical activity levels amongst youth is critical for quantifying physical activity behaviors and evaluating the effect of physical activity interventions. The purpose of this review is to evaluate contemporary approaches to physical activity evaluation amongst youth. Data sources: The literature from a range of sources was reviewed and synthesized to provide an overview of contemporary approaches for measuring youth physical activity. Results: Five broad categories are described: self-report, instrumental movement detection, biological approaches, direct observation, and combined methods. Emerging technologies and priorities for future research are also identified. Conclusions: There will always be a trade-off between accuracy and available resources when choosing the best approach for measuring physical activity amongst youth. Unfortunately, cost and logistical challenges may prohibit the use of "gold standard" physical activity measurement approaches such as doubly labelled water. Other objective methods such as heart rate monitoring, accelerometry, pedometry, indirect calorimetry, or a combination of measures have the potential to better capture the duration and intensity of physical activity, while self-reported measures are useful for capturing the type and context of activity.
Abstract:
This paper investigates the use of the dimensionality-reduction techniques weighted linear discriminant analysis (WLDA) and weighted median Fisher discriminant analysis (WMFD) before probabilistic linear discriminant analysis (PLDA) modeling, for the purpose of improving speaker verification performance in the presence of high inter-session variability. It was recently shown that WLDA techniques can provide improvement over traditional linear discriminant analysis (LDA) for channel compensation in i-vector based speaker verification systems. We show in this paper that the speaker-discriminative information available in the distances between pairs of speakers clustered in the development i-vector space can also be exploited in heavy-tailed PLDA modeling by using the weighted discriminant approaches prior to PLDA modeling. Based upon the results presented in this paper using the NIST 2008 Speaker Recognition Evaluation dataset, we believe that WLDA and WMFD projections before PLDA modeling can provide an improved approach when compared to uncompensated PLDA modeling for i-vector based speaker verification systems.
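The core idea of weighted LDA is to weight each pairwise between-class scatter term by a decreasing function of the distance between the two speaker means, so that confusable speaker pairs dominate the projection. The Python sketch below shows one common formulation on synthetic data; the weighting function, regularisation, dimensions, and data are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np

def weighted_lda(X, labels, n_components, weight=lambda d: d ** -3):
    """Weighted LDA projection: pairwise between-class scatter terms are
    weighted by a decreasing function of the distance between class means."""
    X = np.asarray(X, float)
    classes = np.unique(labels)
    means = {c: X[labels == c].mean(axis=0) for c in classes}
    counts = {c: int(np.sum(labels == c)) for c in classes}
    n, dim = X.shape

    Sw = np.zeros((dim, dim))                       # within-class scatter
    for c in classes:
        d = X[labels == c] - means[c]
        Sw += d.T @ d

    Sb = np.zeros((dim, dim))                       # weighted between-class scatter
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = (means[ci] - means[cj])[:, None]
            dist = float(np.linalg.norm(diff))
            w = weight(dist) if dist > 0 else 0.0
            Sb += w * counts[ci] * counts[cj] / n ** 2 * (diff @ diff.T)

    # Generalized eigenproblem Sb v = lambda Sw v (Sw lightly regularised).
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(dim), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

# Example: project toy 50-dimensional "i-vectors" of 20 speakers down to 10 dims.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(20), 15)
X = rng.normal(size=(300, 50)) + np.repeat(rng.normal(size=(20, 50)), 15, axis=0)
W = weighted_lda(X, labels, n_components=10)
print((X @ W).shape)
```

In an i-vector pipeline, the projected vectors would then be length-normalised and passed to PLDA scoring.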
Abstract:
Differences in the NMR detectability of 39K in various excised rat tissues (liver, brain, kidney, muscle, and testes) have been observed. The lowest NMR detectability occurs for liver (61 ± 3% of potassium as measured by flame photometry) and the highest for erythrocytes (100 ± 7%). These differences in detectability correlate with differences in the measured 39K NMR relaxation constants in the same tissues. 39K detectabilities were also found to correlate inversely with the mitochondrial content of the tissues. Mitochondria prepared from liver showed greatly reduced 39K NMR detectability when compared with the tissue from which they were derived: 31.6 ± 9% of potassium measured by flame photometry compared to 61 ± 3%. The detectability of potassium in mitochondria was too low to enable the measurement of relaxation constants. This study indicates that differences in tissue structure, particularly mitochondrial content, are important in determining 39K detectability and measured relaxation rates.
Abstract:
Navigational collisions are one of the major safety concerns for many seaports. The continuing growth of shipping traffic in number and size is likely to result in an increased number of traffic movements, which consequently could result in a higher risk of collisions in these restricted waters. This continually increasing safety concern warrants a comprehensive technique for modeling collision risk in port waters, particularly for modeling the probability of collision events and the associated consequences (i.e., injuries and fatalities). A number of techniques have been utilized for modeling the risk qualitatively, semi-quantitatively and quantitatively. These traditional techniques mostly rely on historical collision data, often in conjunction with expert judgments. However, they are hampered by several shortcomings, such as the randomness and rarity of collision occurrence, which leads to an insufficient number of collision counts for sound statistical analysis; insufficiency in explaining collision causation; and a reactive approach to safety. A promising alternative approach that overcomes these shortcomings is the navigational traffic conflict technique (NTCT), which uses traffic conflicts as an alternative to collisions for modeling the probability of collision events quantitatively. This article explores the existing techniques for modeling collision risk in port waters. In particular, it identifies the advantages and limitations of the traditional techniques and highlights the potential of the NTCT to overcome these limitations. In view of the principles of the NTCT, a structured method for managing collision risk is proposed. This risk management method allows safety analysts to diagnose safety deficiencies in a proactive manner, and consequently has great potential for managing collision risk in a fast, reliable and efficient manner.
Abstract:
This chapter reviews common barriers to community engagement for Latino youth and suggests ways to move beyond those barriers by empowering them to communicate their experiences, address the challenges they face, and develop recommendations for making their community more youth-friendly. As a case study, this chapter describes a program called Youth FACE IT (Youth Fostering Active Community Engagement for Integration and Transformation) in Boulder County, Colorado. The program enables Latino youth to engage in critical dialogue and participate in a community-based initiative. The chapter concludes by explaining specific strategies that planners can use to support active community engagement and develop a future generation of planners and engaged community members that reflects emerging demographics.
Abstract:
In this paper we consider the variable-order time fractional diffusion equation. We adopt the Coimbra variable-order (VO) time fractional operator, which defines a consistent method for VO differentiation of physical variables. The Coimbra variable-order fractional operator can also be viewed as a Caputo-type definition. Although this definition is the most appropriate one, having fundamental characteristics that are desirable for physical modeling, numerical methods for fractional partial differential equations using this definition have not yet appeared in the literature. Here an approximate scheme is first proposed. The stability, convergence and solvability of this numerical scheme are discussed via the technique of Fourier analysis. Numerical examples are provided to show that the numerical method is computationally efficient. Crown Copyright © 2012 Published by Elsevier Inc. All rights reserved.
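For reference, the following is a hedged sketch of the objects involved; the exact formulation, source term, and discretisation in the paper may differ.

```latex
% Coimbra variable-order operator, for 0 < \alpha(t) < 1 (sketch for reference only).
\[
  {}^{C}\!D_t^{\alpha(t)} u(t)
  = \frac{1}{\Gamma\bigl(1-\alpha(t)\bigr)}
    \int_{0^{+}}^{t} (t-\sigma)^{-\alpha(t)}\,
      \frac{\partial u(\sigma)}{\partial\sigma}\,\mathrm{d}\sigma
    + \frac{\bigl(u(0^{+})-u(0^{-})\bigr)\,t^{-\alpha(t)}}{\Gamma\bigl(1-\alpha(t)\bigr)} .
\]
% A variable-order time fractional diffusion equation of the general form
\[
  {}^{C}\!D_t^{\alpha(t)} u(x,t)
  = \kappa\,\frac{\partial^{2} u(x,t)}{\partial x^{2}} + f(x,t),
  \qquad 0 < \alpha(t) < 1,
\]
% is typically discretised with a quadrature of the memory integral in time and
% second-order differences in space; stability and convergence of such schemes
% can then be examined by Fourier (von Neumann) analysis.
```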
Abstract:
Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, great challenges arise in maintaining high data quality in the face of rapid data throughput and heterogeneous access. Yet, despite the importance of data quality, the literature has usually reduced data quality to detecting and correcting poor data such as outliers and incomplete or inaccurate values. As a result, organisations are unable to efficiently and effectively assess data quality. Having an accurate and proper data quality assessment method will enable users to benchmark their systems and monitor their improvement. This paper introduces a granule mining approach for measuring the degree of randomness of erroneous data, which will enable decision makers to conduct accurate quality assessments and locate the most severely affected data, thereby providing an accurate estimate of the human and financial resources required for quality improvement tasks.