929 results for Source analysis


Relevance: 30.00%

Abstract:

Background: Kallikrein 15 (KLK15)/Prostinogen is a plausible candidate for prostate cancer susceptibility. Elevated KLK15 expression has been reported in prostate cancer and it has been described as an unfavorable prognostic marker for the disease. Objectives: We performed a comprehensive analysis of association of variants in the KLK15 gene with prostate cancer risk and aggressiveness by genotyping tagSNPs, as well as putative functional SNPs identified by extensive bioinformatics analysis. Methods and Data Sources: Twelve out of 22 SNPs, selected on the basis of linkage disequilibrium pattern, were analyzed in an Australian sample of 1,011 histologically verified prostate cancer cases and 1,405 ethnically matched controls. Replication was sought from two existing genome-wide association studies (GWAS): the Cancer Genetic Markers of Susceptibility (CGEMS) project and a UK GWAS study. Results: Two KLK15 SNPs, rs2659053 and rs3745522, showed evidence of association (p < 0.05) but were not present on the GWAS platforms. KLK15 SNP rs2659056 was found to be associated with prostate cancer aggressiveness and showed evidence of association in a replication cohort of 5,051 patients from the UK, Australia, and the CGEMS dataset of US samples. A highly significant association with Gleason score was observed when the data were combined from these three studies, with an odds ratio (OR) of 0.85 (95% CI = 0.77–0.93; p = 2.76 × 10⁻⁴). The rs2659056 SNP is predicted to alter binding of the RORalpha transcription factor, which has a role in the control of cell growth and differentiation and has been suggested to control the metastatic behavior of prostate cancer cells. Conclusions: Our findings suggest a role for KLK15 genetic variation in the etiology of prostate cancer among men of European ancestry, although further studies in very large sample sets are necessary to confirm effect sizes.
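The reported effect can be sanity-checked from the summary statistics alone: under a normal (Wald) approximation on the log odds-ratio scale, the 95% CI bounds recover the standard error and hence an approximate two-sided p-value. A minimal sketch; note the study's own p-value comes from its combined association test, so the Wald figure is only expected to agree in order of magnitude:

```python
import math

def p_from_or_ci(odds_ratio, ci_low, ci_high):
    """Approximate two-sided p-value recovered from an odds ratio and
    its 95% CI, assuming a normal (Wald) log-OR sampling distribution."""
    log_or = math.log(odds_ratio)
    # 95% CI spans 2 * 1.96 standard errors on the log scale
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.959964)
    z = log_or / se
    return math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))

p = p_from_or_ci(0.85, 0.77, 0.93)  # same order of magnitude as reported
```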

Relevance: 30.00%

Abstract:

Although accountability in the form of high-stakes testing is in favour in the contemporary Australian educational context, this practice remains a highly contested source of debate. Proponents of high-stakes tests claim that higher standards in teaching and learning result from their implementation, whereas others believe that this type of testing regime is not required and may in fact be counterproductive. Regardless of which side of the debate you sit on, the reality is that, at present, high-stakes testing appears to be here to stay. It could therefore be argued that it is essential for teachers to understand accountability and possess the specific skills to interpret and use test data beneficially.

Relevance: 30.00%

Abstract:

Acoustic emission (AE) is the phenomenon in which stress waves are generated by the rapid release of energy within a material from sources such as crack initiation or growth. The AE technique involves recording the stress waves by means of sensors and subsequent analysis of the recorded signals to gather information about the nature of the source. Though the AE technique is one of the most popular non-destructive evaluation (NDE) techniques for structural health monitoring of mechanical, aerospace and civil structures, several challenges still exist in its successful application. The presence of spurious noise signals can mask genuine damage-related AE signals; hence a major challenge is finding ways to discriminate signals from different sources. Analysis of parameters of recorded AE signals, comparison of amplitudes of AE wave modes and investigation of the uniqueness of recorded AE signals have been proposed as possible criteria for source differentiation. This paper reviews common approaches currently in use for source discrimination, particularly focusing on structural health monitoring of civil engineering structural components such as beams, and further investigates the applications of some of these methods by analyzing AE data from laboratory tests.
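One simple discrimination criterion of the kind surveyed here compares a recorded waveform against a reference template by normalized cross-correlation. A minimal sketch at zero lag only (the waveforms below are hypothetical; practical AE discrimination combines several such features, including coherence and per-mode amplitudes):

```python
import math

def similarity(x, y):
    """Zero-lag normalized cross-correlation of two equal-length
    waveforms; values near 1 suggest signals from a similar source."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

burst = [0.0, 0.9, -0.7, 0.5, -0.3, 0.1]   # hypothetical crack-like burst
rub   = [0.2, 0.1, 0.3, 0.2, 0.1, 0.3]     # hypothetical rubbing noise
```

A genuine crack signal correlates strongly with a crack template and weakly with rubbing noise, which is the basis for thresholding on the similarity score.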

Relevance: 30.00%

Abstract:

Structural health monitoring (SHM) refers to the procedure used to assess the condition of structures so that their performance can be monitored and any damage can be detected early. Early detection of damage and appropriate retrofitting will aid in preventing failure of the structure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently during its whole intended life. Though visual inspection and other techniques such as vibration-based ones are available for SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option and is increasing in use. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate the source, its passive nature (energy from the damage source itself is utilised, with no need to supply energy from outside) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges still exist in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked to three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of an AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the times of arrival and velocities of the AE signals recorded by a number of sensors.
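In the simplest one-dimensional case, the arrival-time approach just described reduces to a closed form: with two sensors a distance D apart recording an arrival-time difference Δt for a wave of velocity v, the distances d₁ and d₂ to the source satisfy d₁ + d₂ = D and d₁ − d₂ = vΔt, so d₁ = (D + vΔt)/2. A minimal sketch (the spacing and wave speed are illustrative values, not figures from this thesis):

```python
def linear_ae_location(sensor_spacing_m, velocity_mps, dt_s):
    """1-D AE source location from the arrival-time difference between
    two sensors; returns the distance from sensor 1 in metres.
    dt_s = t1 - t2 (positive when the wave reaches sensor 2 first)."""
    return (sensor_spacing_m + velocity_mps * dt_s) / 2.0

# Sensors 2 m apart, wave speed 5000 m/s, sensor 1 triggers 0.0002 s later:
d1 = linear_ae_location(2.0, 5000.0, 0.0002)  # → 1.5 m from sensor 1
```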
Complications arise, however, because AE waves can travel through a structure in a number of different modes with different velocities and frequencies. Hence, to accurately locate a source it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools such as the short-time Fourier transform to identify the modes, and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of the modes captured by a single sensor for source localization. A major problem in the practical use of the AE technique is the presence of AE sources other than crack-related ones, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence discrimination of signals to identify their sources is very important. This work developed a model that uses different signal processing tools, such as cross-correlation, magnitude-squared coherence and energy distribution in different frequency bands, as well as modal analysis (comparing the amplitudes of identified modes), for accurately differentiating signals from different simulated AE sources. Quantification tools to assess the severity of damage sources are highly desirable in practical applications. Though different damage quantification methods have been proposed for the AE technique, not all have achieved universal acceptance or proven suitable for all situations. The b-value analysis, which involves the study of the distribution of amplitudes of AE signals, and its modified form (known as improved b-value analysis), was investigated for suitability for damage quantification in ductile materials such as steel.
This was found to give encouraging results for the analysis of data from laboratory tests, thereby extending the possibility of its use for real-life structures. By addressing these primary issues, it is believed that this thesis has helped improve the effectiveness of the AE technique for structural health monitoring of civil infrastructure such as bridges.
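The amplitude-distribution idea behind the b-value analysis mentioned above can be sketched with the Aki maximum-likelihood estimator, treating A/20 (amplitude in dB) as an equivalent magnitude, a common AE convention. This is illustrative only: the improved b-value analysis investigated in the thesis uses a different, statistics-based scaling, and the amplitude lists below are hypothetical:

```python
import math

def ae_b_value(amplitudes_db):
    """Aki maximum-likelihood b-value from AE amplitudes in dB;
    a falling b-value over time suggests a shift from distributed
    microcracking toward localized macrocrack growth."""
    mags = [a / 20.0 for a in amplitudes_db]
    m_min, m_mean = min(mags), sum(mags) / len(mags)
    return math.log10(math.e) / (m_mean - m_min)

microcracking = [40, 42, 44, 41, 43, 45, 40, 42]  # many small events
macrocrack    = [40, 55, 70, 62, 48, 75, 58, 66]  # broader, larger events
```

A data set dominated by small events yields a higher b-value than one containing large-amplitude events, which is what makes the b-value usable for severity assessment.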

Relevance: 30.00%

Abstract:

The concept of Six Sigma was initiated in the 1980s by Motorola. Since then it has been implemented in several manufacturing and service organizations. To date, Six Sigma implementation in services has mostly been limited to healthcare and financial services in the private sector. Its implementation is now gradually picking up in services such as call centres, education, and construction and related engineering, in both the private and public sectors. Through a literature review, a questionnaire survey and a multiple case study approach, this paper develops a conceptual framework to facilitate widening the scope of Six Sigma implementation in service organizations. Using grounded theory methodology, this study develops theory for Six Sigma implementation in service organizations. The survey was conducted in service organizations in Singapore and was exploratory in nature. The case studies involved three service organizations that had implemented Six Sigma; the objective was to explore and understand the issues highlighted by the survey and the literature. The findings confirm the inclusion of critical success factors, critical-to-quality characteristics, and a set of tools and techniques, as observed from the literature. In the case of key performance indicators, there are differing interpretations in the literature and among industry practitioners: some describe key performance indicators as performance metrics, whereas others regard them as key process input or output variables, which is similar to the interpretations of Six Sigma practitioners. The responses "not relevant" and "unknown to us" as reasons for not implementing Six Sigma show the need to understand the specific requirements of service organizations. Though much theoretical description of Six Sigma is available, there has been limited rigorous academic research on it. This gap is far more pronounced for Six Sigma implementation in service organizations, where the theory is not yet mature. Identifying this need, the study contributes through a theory-building exercise, developing a conceptual framework to understand the issues involved in its implementation in service organizations.

Relevance: 30.00%

Abstract:

Atmospheric deposition is one of the most important pathways of urban stormwater pollution. Atmospheric deposition, which can take the form of either wet or dry deposition, has distinct characteristics in terms of associated particle sizes, pollutant types and influential parameters. This paper discusses the outcomes of a comprehensive research study undertaken to identify the important traffic characteristics and climate factors, such as antecedent dry period and rainfall characteristics, which influence the characteristics of wet and dry deposition of solids and heavy metals. The outcomes confirmed that zinc (Zn) is correlated with traffic volume, whereas lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) are correlated with traffic congestion. Consequently, reducing traffic congestion will be more effective than reducing traffic volume for improving air quality, particularly in relation to Pb, Cd, Ni and Cu. Zn was found to have the highest atmospheric deposition rate of the heavy metals. Zn in dry deposition is associated with relatively larger particle size fractions (>10 µm), whereas Pb, Cd, Ni and Cu are associated with relatively smaller particle size fractions (<10 µm). The analysis further revealed that bulk (wet plus dry) deposition is correlated with rainfall depth and contains a relatively higher percentage of smaller particles than dry deposition, which is correlated with the antecedent dry period. As particles subjected to wet deposition are smaller, they disperse over a larger area from the source of origin than particles subjected to dry deposition, since buoyancy forces become dominant over gravity for smaller particles. Furthermore, exhaust emission particles were found to be primarily associated with bulk deposition, whereas dry deposition particles mainly originate from vehicle component wear.

Relevance: 30.00%

Abstract:

A body of research in conversation analysis has identified a range of structurally-provided positions in which sources of trouble in talk-in-interaction can be addressed using repair. These practices are contained within what Schegloff (1992) calls the repair space. In this paper, I examine a rare instance in which a source of trouble is not resolved within the repair space and comes to be addressed outside of it. The practice by which this occurs is a post-completion account; that is, an account that is produced after the possible completion of the sequence containing a source of trouble. Unlike fourth position repair, the final repair position available within the repair space, this account is not made in preparation for a revised response to the trouble-source turn. Its more restrictive aim, rather, is to circumvent an ongoing difference between the parties involved. I argue that because the trouble is addressed in this manner, and in this particular position, the repair space can be considered as being limited to the sequence in which a source of trouble originates.

Relevance: 30.00%

Abstract:

None of the currently used tonometers produces estimated IOP values that are free of error. Measurement unreliability arises from the indirect measurement of corneal deformation and the fact that pressure calculations are based on population-averaged parameters of the anterior segment. Reliable IOP values are crucial for understanding and monitoring a number of eye pathologies, e.g. glaucoma. We have combined high-speed swept-source OCT with an air-puff chamber. The system provides direct measurement of the deformation of the cornea and the anterior surface of the lens. This paper describes in detail the performance of the air-puff ssOCT instrument. We present different approaches to data presentation and analysis. Changes in deformation amplitude appear to be a good indicator of IOP changes. However, it seems that in order to provide accurate intraocular pressure values, additional information on corneal biomechanics is necessary. We believe that such information could be extracted from the data provided by the air-puff ssOCT.

Relevance: 30.00%

Abstract:

As e-commerce becomes more and more popular, the number of customer reviews that a product receives grows rapidly. In order to enhance customer satisfaction and shopping experiences, it has become important to analyse customer reviews to extract opinions on the products they buy. Opinion mining is thus becoming increasingly important, especially for analysing and forecasting customer behaviour for business purposes: the right decision in producing new products or services, based on data about customers' characteristics, means profit for the organization. This paper proposes a new architecture for opinion mining, which uses a multidimensional model to integrate customers' characteristics and their comments about products (or services). The key step towards this objective is to transfer comments (opinions) to a fact table that includes several dimensions, such as customers, products, time and locations. This research presents a comprehensive way to calculate customers' orientation towards all possible product attributes.
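The fact-table idea can be sketched as a star schema, with one fact row per mined opinion keyed to the customer, product, time and location dimensions the paper lists. All table and column names below are hypothetical, and "orientation" is taken as the mean polarity over matching opinions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical star schema: dimension tables plus one fact row per opinion.
cur.executescript("""
CREATE TABLE dim_customer(id INTEGER PRIMARY KEY, age_group TEXT);
CREATE TABLE dim_product(id INTEGER PRIMARY KEY, name TEXT, attribute TEXT);
CREATE TABLE fact_opinion(
    customer_id INTEGER, product_id INTEGER,
    day TEXT, location TEXT,
    polarity REAL  -- e.g. -1.0 .. 1.0 from the opinion miner
);
""")
cur.execute("INSERT INTO dim_customer VALUES (1, '25-34')")
cur.execute("INSERT INTO dim_product VALUES (1, 'camera', 'battery life')")
cur.execute("INSERT INTO fact_opinion VALUES (1, 1, '2024-01-05', 'SG', 0.8)")
cur.execute("INSERT INTO fact_opinion VALUES (1, 1, '2024-01-06', 'SG', -0.2)")

# Customer orientation toward a product attribute = mean polarity.
orientation = cur.execute("""
    SELECT AVG(f.polarity)
    FROM fact_opinion f JOIN dim_product p ON p.id = f.product_id
    WHERE p.attribute = 'battery life'
""").fetchone()[0]
```

Once opinions sit in a fact table, orientation can be rolled up along any dimension (per age group, per location, per month) with ordinary OLAP-style aggregation queries.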

Relevance: 30.00%

Abstract:

Secure communications in distributed wireless sensor networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of the post-deployment network configuration, and due to the limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key through a secure path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented and compared based on their security properties and resource usage. We provide a taxonomy of solutions and identify their trade-offs to conclude that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pair-wise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many and which keys to assign to each key-chain before sensor network deployment. The performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, which requires that key agreement algorithms without authentication be executed over a secure path. The length of the secure path impacts the power consumption and the initialization delay for a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that they are both NP-Hard and MAX-SNP-Hard. Having established these inapproximability results, we focus on the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm in which each pair of nodes can establish an authenticated key by using their neighbors as witnesses.
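The combinatorial-design flavour of key pre-distribution can be illustrated with the smallest projective plane, the Fano plane (order 2): a pool of 7 keys and 7 key-chains of 3 keys each, where any two chains intersect in exactly one key, so every node pair deterministically shares a key. This is a toy instance rather than the thesis's actual construction; practical deployments use designs of much larger order:

```python
from itertools import product, combinations

# Points of the Fano plane = nonzero vectors of GF(2)^3; each "line"
# (key-chain) collects the key indices orthogonal to one such vector.
points = [v for v in product((0, 1), repeat=3) if any(v)]

def dot(u, w):
    return sum(a * b for a, b in zip(u, w)) % 2

key_chains = [frozenset(i for i, v in enumerate(points) if dot(v, w) == 0)
              for w in points]

# Design guarantee: 3 keys per node, any two nodes share exactly one key.
assert all(len(c) == 3 for c in key_chains)
assert all(len(a & b) == 1 for a, b in combinations(key_chains, 2))
```

The attraction over probabilistic schemes is exactly this guarantee: connectivity is certain by construction, at the cost of a rigid relationship between pool size, chain length and network size.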

Relevance: 30.00%

Abstract:

Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced, climate-based daylight metrics (CBDM). Yet these tools (new metrics and simulation tools) are not currently well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that can import a wide variety of file types or be integrated into current 3D modelling software or packages. These tools need to be able to calculate both point-in-time simulations and annual analyses. There is a current need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in-based tools is attempting to meet this need through third-party analysis, although some of these packages are heavily reliant on their host program. Such programs nevertheless allow dynamic daylighting simulation, which will make it easier to calculate accurate daylighting no matter which modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
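For reference, the daylight factor the paper critiques is a single static ratio, which is precisely why climate-based metrics were introduced; a minimal sketch with illustrative illuminance values:

```python
def daylight_factor(e_indoor_lux, e_outdoor_lux):
    """Daylight factor: indoor horizontal illuminance as a percentage of
    simultaneous unobstructed outdoor illuminance under a CIE overcast
    sky. Static by design: it ignores climate, orientation and time."""
    return 100.0 * e_indoor_lux / e_outdoor_lux

df = daylight_factor(200.0, 10000.0)  # → 2.0 (percent)
```

Because the ratio is fixed for a given geometry, two rooms with identical DF can perform very differently over a real year, which is the gap CBDM tools aim to close.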

Relevance: 30.00%

Abstract:

Objective: The aim of this study was to examine the prevalence of overweight and obesity and its association with demographic, reproductive and workforce variables in a representative cohort of working nurses and midwives. Design: A cross-sectional study of self-reported survey data. Settings: Australia, New Zealand and the United Kingdom. Methods: Measurement outcomes included BMI categories, demographic (age, gender, marital status, ethnicity), reproductive (parity, number of births, mother's age at first birth, birth type and menopausal status) and workforce (registration council, employment type and principal specialty) variables. Participants: 4,996 respondents to the Nurses and Midwives e-Cohort study who were currently registered and working in nursing or midwifery in Australia (n=3,144), New Zealand (n=778) or the United Kingdom (n=1,074). Results: Amongst the sample, 61.87% were outside the healthy weight range, and across all three jurisdictions the prevalence of obesity in nurses and midwives exceeded rates in the source populations by 1.73% to 3.74%. Being overweight or obese was significantly associated with increasing age (35–44 yrs aOR 1.71, 95% CI 1.41–2.08; 45–54 yrs aOR 1.90, 95% CI 1.56–2.31; 55–64 yrs aOR 2.22, 95% CI 1.71–2.88) and male gender (aOR 1.46, 95% CI 1.15–1.87). Primiparous nurses and midwives were more likely to be overweight or obese (aOR 1.37, 95% CI 1.06–1.76), as were those who had reached menopause (aOR 1.37, 95% CI 1.11–1.69). Nurses and midwives in part-time or casual employment had significantly reduced risk of being overweight or obese (aOR 0.81, 95% CI 0.70–0.94 and aOR 0.75, 95% CI 0.59–0.96 respectively), whilst working in aged care carried increased risk (aOR 1.37, 95% CI 1.04–1.80). Conclusion: Nurses and midwives in this study have a higher prevalence of obesity and overweight than the general population, and those who are older, male, primiparous or menopausal have significantly higher risk of overweight or obesity, as do those working full-time or in aged care. The consequences of overweight and obesity in this occupational group may impact their workforce participation and their management of the overweight and obese patients in their care, as well as influencing their individual health behaviours and risks of occupational injury and chronic disease.

Relevance: 30.00%

Abstract:

The aim of this systematic review was to examine the effect of contrast water therapy (CWT) on recovery following exercise-induced muscle damage. Controlled trials were identified from computerized literature searching and citation tracking performed up to February 2013. Eighteen trials met the inclusion criteria; all had a high risk of bias. Pooled data from 13 studies showed that CWT resulted in significantly greater improvements in muscle soreness at the five follow-up time points (<6, 24, 48, 72 and 96 hours) in comparison to passive recovery. Pooled data also showed that CWT significantly reduced muscle strength loss at each follow-up time (<6, 24, 48, 72 and 96 hours) in comparison to passive recovery. Despite comparing CWT to a large number of other recovery interventions, including cold water immersion, warm water immersion, compression, active recovery and stretching, there was little evidence for a superior treatment intervention. The current evidence base shows that CWT is superior to passive recovery or rest after exercise; the magnitude of these effects may be most relevant to an elite sporting population. There seems to be little difference in recovery outcome between CWT and other popular recovery interventions.

Relevance: 30.00%

Abstract:

Long-term exposure to vehicle emissions has been associated with harmful health effects. Children are amongst the most susceptible groups, and schools represent an environment where they can experience significant exposure to vehicle emissions. However, there are limited studies on children's exposure to vehicle emissions in schools. The aim of this study was to quantify the concentration of organic aerosol, and in particular vehicle emissions, that children are exposed to during school hours. An Aerodyne compact time-of-flight aerosol mass spectrometer (TOF-AMS) was therefore deployed at five urban schools in Brisbane, Australia. The TOF-AMS enabled the chemical composition of the non-refractory PM1 (NR-PM1) to be analysed with high temporal resolution to assess the concentration of vehicle emissions and other organic aerosols during school hours. At each school the organic fraction comprised the majority of NR-PM1, with secondary organic aerosols as the main constituent. At two of the schools, a significant source of the organic aerosol (OA) was slightly aged vehicle emissions from nearby highways. More aged and oxidised OA was observed at the other three schools, which also recorded strong biomass burning influences. Primary emissions were found to dominate the OA at only one school, which had an O:C ratio of 0.17 due to fuel-powered gardening equipment used near the TOF-AMS. The diurnal cycle of OA concentration varied between schools and was found to be at a minimum during school hours. The major organic component that school children were exposed to during school hours was secondary OA. Peak exposure of school children to hydrocarbon-like OA (HOA) occurred during school drop-off and pick-up times. Unless a school is located near major roads, children in urban environments are exposed predominantly to regional secondary OA, as opposed to local emissions, during school hours.

Relevance: 30.00%

Abstract:

This paper presents the idea of a compendium of process technologies, i.e., a concise but comprehensive collection of techniques for process model analysis that support research on the design, execution, and evaluation of processes. The idea originated from observations on the evolution of process-related research disciplines. Based on these observations, we derive design goals for a compendium. Then, we present the jBPT library, which addresses these goals by means of an implementation of common analysis techniques in an open source codebase.