705 results for QC


Relevance: 10.00%

Abstract:

Portland cement concrete (PCC) pavement undergoes repeated environmental load-related deflection resulting from temperature and moisture variations across the pavement depth. This phenomenon, referred to as PCC pavement curling and warping, has been known and studied since the mid-1920s. Slab curvature can be further magnified under repeated traffic loads and may ultimately lead to fatigue failures, including top-down and bottom-up transverse, longitudinal, and corner cracking. It is therefore important to measure the “true” degree of curling and warping in PCC pavements, not only for quality control (QC) and quality assurance (QA) purposes, but also to achieve a better understanding of its relationship to long-term pavement performance. In order to better understand the curling and warping behavior of PCC pavements in Iowa and provide recommendations to mitigate curling and warping deflections, field investigations were performed at six existing sites during the late fall of 2015. These sites included PCC pavements with various ages, slab shapes, mix design aspects, and environmental conditions during construction. A stationary light detection and ranging (LiDAR) device was used to scan the slab surfaces. The degree of curling and warping along the longitudinal, transverse, and diagonal directions was calculated for the selected slabs based on the point clouds acquired using LiDAR. The results and findings are correlated to variations in pavement performance, mix design, pavement design, and construction details at each site. Recommendations regarding how to minimize curling and warping are provided based on a literature review and this field study. Some examples of using point cloud data to build three-dimensional (3D) models of the overall curvature of the slab shape are presented to show the feasibility of using this 3D analysis method for curling and warping analysis.
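To make the point-cloud step concrete, here is a minimal sketch, our illustration rather than the study's published algorithm, of estimating curling along one direction: fit a low-order surface to the slab point cloud, then measure the deviation of the fitted surface from the chord joining two slab points. The helper names `fit_quadratic_surface` and `curl_along_direction` are hypothetical.

```python
import numpy as np

def fit_quadratic_surface(points):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    to a slab-surface point cloud (N x 3 array of x, y, z in metres)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def curl_along_direction(coeffs, p0, p1, n=50):
    """Deviation of the fitted surface from the chord joining two slab
    points p0 and p1 (e.g. corner to corner for the diagonal direction).
    Returns the maximum chord-to-surface offset in metres."""
    t = np.linspace(0.0, 1.0, n)
    xs = p0[0] + t * (p1[0] - p0[0])
    ys = p0[1] + t * (p1[1] - p0[1])
    a, b, c, d, e, f = coeffs
    zs = a + b * xs + c * ys + d * xs**2 + e * xs * ys + f * ys**2
    chord = zs[0] + t * (zs[-1] - zs[0])  # straight line between the endpoints
    return np.max(np.abs(zs - chord))
```

For a 4.5 m by 3.6 m slab, `curl_along_direction(coeffs, (0, 0), (4.5, 3.6))` would give the diagonal chord-to-surface offset; repeating along the slab edges gives the longitudinal and transverse values.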

Relevance: 10.00%

Abstract:

Hexagonal resonant triad patterns are shown to exist as stable solutions of a particular type of nonlinear field in which no cubic field nonlinearity is present. The zero 'dc' Fourier mode is shown to stabilise these patterns, which are produced by a pure quadratic field nonlinearity. Closed-form solutions and stability results are obtained near the critical point, complemented by numerical studies far from the critical point. These results are obtained using a neural field based on the Helmholtzian operator. Constraints on the structure and parameters of a general pure quadratic neural field which supports hexagonal patterns are obtained.
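For orientation, a standard weakly nonlinear sketch of why a quadratic nonlinearity selects hexagons, our summary of textbook amplitude-equation theory rather than notation taken from the paper: three modes on the critical circle whose wavevectors sum to zero are mutually coupled at quadratic order.

```latex
% Three critical modes with wavevectors summing to zero:
%   \mathbf{k}_1 + \mathbf{k}_2 + \mathbf{k}_3 = 0, \quad |\mathbf{k}_i| = k_c.
% At quadratic order each mode is driven by the other two:
\dot{A}_1 = \lambda A_1 + \gamma\,\bar{A}_2\bar{A}_3,\qquad
\dot{A}_2 = \lambda A_2 + \gamma\,\bar{A}_3\bar{A}_1,\qquad
\dot{A}_3 = \lambda A_3 + \gamma\,\bar{A}_1\bar{A}_2 .
```

With no cubic term available to saturate this growth, stabilisation must come from elsewhere; the abstract's point is that the zero-wavenumber ('dc') mode, itself driven at quadratic order, feeds back on the triad and supplies that saturation.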

Relevance: 10.00%

Abstract:

A short presentation given in the panel session on HDR at CineFest16, discussing and challenging some basic assumptions and questions about High Dynamic Range (HDR) movies.

Relevance: 10.00%

Abstract:

We give a relativistic spin network model for quantum gravity based on the Lorentz group and its q-deformation, the Quantum Lorentz Algebra. We propose a combinatorial model for the path integral given by an integral over suitable representations of this algebra. This generalises the state sum models for the case of the four-dimensional rotation group previously studied in gr-qc/9709028. As a technical tool, formulae for the evaluation of relativistic spin networks for the Lorentz group are developed, with some simple examples which show that the evaluation is finite in interesting cases. We conjecture that the `10J' symbol needed in our model has a finite value.
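For orientation, our gloss using the form such evaluations take in the Lorentzian state-sum literature rather than a formula quoted from the abstract: the evaluation attaches a point of hyperbolic 3-space to each vertex of the network and a distance kernel to each edge.

```latex
% Schematic Lorentzian evaluation: one point of H^3 per vertex
% (one fixed to remove the noncompact symmetry), a kernel per edge:
\mathcal{I}(\Gamma) \;=\; \int_{(H^3)^{\,n-1}} \prod_{e=(ij)}
   K_{\rho_e}\!\bigl(d(x_i,x_j)\bigr)\;\prod_{i=2}^{n}\mathrm{d}x_i,
\qquad
K_\rho(r) \;=\; \frac{\sin(\rho r)}{\rho\,\sinh r},
```

where d(x_i, x_j) is the hyperbolic distance. The `10J' symbol corresponds to the 4-simplex graph, with five vertices and ten edges; its conjectured finiteness is then the convergence of this ten-kernel integral.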

Relevance: 10.00%

Abstract:

Mammography equipment must be evaluated to ensure that images will be of acceptable diagnostic quality at the lowest radiation dose. Quality Assurance (QA) aims to provide systematic and constant improvement through a feedback mechanism addressing the technical, clinical and training aspects of a service. Quality Control (QC), in relation to mammography equipment, comprises a series of tests to determine equipment performance characteristics. The introduction of digital technologies has prompted changes in QC tests and protocols, and some tests are specific to each manufacturer. Within each country, QC tests should comply with regulatory requirements and guidance. Ideally, one mammography practitioner should take overarching responsibility for QC within a service, with all practitioners having responsibility for the actual QC testing. All QC results must be documented to facilitate troubleshooting, internal audit and external assessment. Generally speaking, the practitioner's role includes performing, interpreting and recording the QC tests, as well as reporting any results outside action limits to their service lead. Practitioners must undertake additional continuous professional development to maintain their QC competencies. They are usually supported by technicians and medical physicists; in some countries the involvement of the latter is mandatory. Technicians and/or medical physicists often perform many of the tests described in this chapter. It is important to recognise that this chapter is an attempt to encompass the main tests performed within European countries; the specific tests required by the service you work within must be familiarised with and adhered to.
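As a hedged illustration of what one routine QC measurement can look like in software (the ROI position and action limit below are placeholders, not values from any national protocol or manufacturer guidance):

```python
import numpy as np

# Hypothetical action limit for illustration; real limits come from the
# applicable national protocol or manufacturer guidance.
SNR_ACTION_LIMIT = 0.85  # fraction of the established baseline SNR

def roi_stats(image, row, col, half=20):
    """Mean and standard deviation inside a square ROI of a phantom image."""
    roi = image[row - half:row + half, col - half:col + half]
    return float(roi.mean()), float(roi.std())

def snr_check(image, baseline_snr, row=256, col=256):
    """Measure SNR in a phantom ROI and flag it if it falls below the
    action limit relative to the service's established baseline."""
    mean, std = roi_stats(image, row, col)
    snr = mean / std
    within_limits = snr >= SNR_ACTION_LIMIT * baseline_snr
    return snr, within_limits
```

Recording the returned value against the date, as the chapter requires, is what makes trend analysis and troubleshooting possible later.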

Relevance: 10.00%

Abstract:

Introduction: Adolescence, as a developmental period favourable to the establishment of first romantic relationships, is conducive to the construction of attitudes about intimacy, but also to the first manifestations of power and control within these relationships (Leitão et al., 2013). Assessing knowledge about the phenomenon is the first step towards implementing programmes aimed at building the competencies that promote happy and healthy intimate relationships. Objectives: To present the evaluation of the psychometric properties of the Questionnaire of Knowledge about Violence in Intimate Relationships (QC-VRI), which measures the knowledge adolescents hold about this phenomenon. Methodology: A validation study of a measurement instrument, developed in two phases: the construction of a 47-item questionnaire on the causes, consequences and frequency of dating violence, and the evaluation of its psychometric properties. For the psychometric evaluation, the questionnaire was administered to a sample of 465 Portuguese adolescents and young people attending secondary and higher education, 81.5% female and 18.5% male, with a mean age of 17.91 years. Results: The sampling adequacy test yielded a value above 0.6 (KMO = 0.655), supporting the adequacy of the correlation matrix, and Bartlett's test of sphericity was significant at the 5% level (p < 0.001). The QC-VRI shows good reliability indices (> 0.70) and a factor structure consistent with the principles underlying its development, comprising 21 items organised into a 5-factor solution that explains 47.33% of the variance. Conclusions: The questionnaire can be used both as a screening measure of knowledge and as a measure of the impact of awareness-raising or training interventions on violence in intimate relationships among adolescents and young people.
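A minimal sketch of the reported psychometric checks (KMO, Bartlett's sphericity, and a 5-factor extraction) using the Python factor_analyzer package; the file name `responses.csv` is a placeholder for an item-response matrix.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity, calculate_kmo)

# Placeholder file: one row per respondent, one column per questionnaire item.
items = pd.read_csv("responses.csv")

chi2, p = calculate_bartlett_sphericity(items)   # want p < 0.05
_, kmo_total = calculate_kmo(items)              # want KMO > 0.6

# Extract the 5-factor solution reported in the abstract.
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(items)
variance, proportion, cumulative = fa.get_factor_variance()
print(f"KMO={kmo_total:.3f}, Bartlett p={p:.4f}, "
      f"variance explained={cumulative[-1]:.1%}")
```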

Relevance: 10.00%

Abstract:

With the construction of operational oceanography systems, the need for real-time data has become more and more important. A lot of work has been done in the past, within National Oceanographic Data Centres (NODCs) and the International Oceanographic Data and Information Exchange (IODE), to standardise delayed-mode quality control procedures. For quality control procedures applicable in real time (within hours to at most a week from acquisition), which must therefore run automatically, some recommendations were drawn up for physical parameters, but mainly within individual projects and without consolidation across initiatives. Over the past ten years the EuroGOOS community has been working on such procedures within international programmes such as Argo, OceanSITES and GOSUD, and within EC projects such as Mersea, MFSTEP, FerryBox, ECOOP, and MyOcean. In collaboration with the FP7 SeaDataNet project, which is standardising delayed-mode quality control procedures in NODCs, and the MyOcean GMES FP7 project, which is standardising near-real-time quality control procedures for operational oceanography, the DATA-MEQ working group decided to put together this document to summarise the recommendations for near-real-time QC procedures that it judged mature enough to be advertised and recommended to EuroGOOS.
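As a hedged sketch of what automated near-real-time checks of this kind look like (the thresholds below are placeholders in the spirit of, not quoted from, the Argo real-time QC tests):

```python
import numpy as np

def global_range_test(values, vmin=-2.5, vmax=40.0):
    """Flag values outside a plausible physical range.
    Limits here are placeholders for sea temperature in deg C."""
    v = np.asarray(values, dtype=float)
    return (v < vmin) | (v > vmax)

def spike_test(values, threshold=6.0):
    """Flag point i when it deviates from the mean of its neighbours by
    more than the neighbour half-difference plus a threshold, in the
    spirit of the Argo real-time spike test (threshold is illustrative)."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.shape, dtype=bool)
    for i in range(1, len(v) - 1):
        test = abs(v[i] - (v[i - 1] + v[i + 1]) / 2) \
               - abs((v[i + 1] - v[i - 1]) / 2)
        flags[i] = test > threshold
    return flags
```

Tests like these run unattended on each incoming profile and attach per-measurement flags, which downstream operational users can then filter on.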

Relevance: 10.00%

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in the decision to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were assessed by validating the assumptions against the evidence derived from the analysis. The research findings are as follows:

1. The system would help improve project performance.
2. Resistance to implementation may be experienced if the system is implemented; resistance therefore needs to be investigated further, and more R&D work is needed before advancing to the final design and implementation.
3. The system may be divided into standalone modules in order to simplify it and facilitate incremental changes.
4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors.

Yin (2009) Case Study Research Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus (1983) Resistance Theory was used during the definition of the assumptions to predict potential problems with the implementation of web-based project management systems in the dredging industry. Keen (1981) incremental changes and facilitative approach tactics were used as the basis for classifying solutions and for overcoming resistance to implementation of the web-based project management system. Davis (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance: 10.00%

Abstract:

Quantum mechanics, optics and indeed any wave theory exhibit the phenomenon of interference. In this thesis we present two problems investigating interference due to indistinguishable alternatives, and a mostly unrelated investigation into the free-space propagation speed of light pulses in particular spatial modes. In chapter 1 we introduce the basic properties of the electromagnetic field needed for the subsequent chapters. In chapter 2 we review the properties of interference using the beam splitter and the Mach-Zehnder interferometer. In particular, we review what happens when one of the paths of the interferometer is marked in some way, so that a particle having traversed it carries information as to which path it went down (to be followed up in chapter 3), and we review Hong-Ou-Mandel interference at a beam splitter (to be followed up in chapter 5). In chapter 3 we present the first of the interference problems. This consists of a nested Mach-Zehnder interferometer in which each of the free-space propagation segments is weakly marked by mirrors vibrating at different frequencies [1]. The original experiment drew the conclusion that the photons followed disconnected paths. We partition the description of the light in the interferometer according to the number of paths about which it carries which-way information, and reinterpret the results reported in [1] in terms of the interference of paths spatially connected from source to detector. In chapter 4 we briefly review optical angular momentum, entanglement and spontaneous parametric down-conversion. These concepts feed into chapter 5, in which we present the second of the interference problems, namely Hong-Ou-Mandel interference with particles possessing two degrees of freedom. We analyse the problem in terms of exchange symmetry for both boson and fermion pairs and show that the particle statistics at a beam splitter can be controlled for suitably chosen states. We propose an experimental test of these ideas using orbital-angular-momentum-entangled photons. In chapter 6 we look at the effect that the transverse spatial structure of the mode in which a pulse of light is excited has on its group velocity. We show that the resulting group velocity is slower than c, the vacuum speed of plane waves, and that this reduction in the group velocity is related to the spread in wave vectors required to create the transverse spatial structure. We present experimental results of the measurement of this slowing using Hong-Ou-Mandel interference.
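A hedged sketch of the chapter 6 argument in our own notation, not the thesis's: building transverse structure requires a spread of transverse wavevectors k⊥, each component of which contributes a reduced axial wavenumber, so the pulse's group velocity falls below c.

```latex
% Axial wavenumber of a plane-wave component with transverse wavevector k_perp:
k_z \;=\; \sqrt{k^2 - k_\perp^2} \;\approx\; k - \frac{k_\perp^2}{2k}
\quad\Longrightarrow\quad
\frac{v_g}{c} \;\approx\; 1 - \frac{\langle k_\perp^2\rangle}{2k^2}.
```

For example, a Bessel beam whose plane-wave components lie on a cone of half-angle θ propagates with v_g ≈ c cos θ, slightly below c.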

Relevance: 10.00%

Abstract:

Little is known about how historic wood ages naturally; most studies focus instead on biological decay, as it is often assumed that wood otherwise remains stable with age. This PhD project was organised by Historic Scotland and the University of Glasgow to investigate the natural chemical and physical aging of wood. The natural aging of wood is a concern for Historic Scotland because traditional timber replacement, in which rotten timber is replaced with new timber of the same species, is the standard form of repair in wooden cultural heritage. The project was set up to examine what differences could exist, both chemically and physically, between old and new wood that could put unforeseen stress on the joint between them. Through Historic Scotland it was possible to work with genuine historic wood of two species, oak and Scots pine, both from the 1500s, rather than relying on artificial aging. Artificial aging of wood remains a debated topic, with open questions as to whether it truly mimics the aging process or merely damages the wood cells. The chemical stability of the wood was investigated using Fourier-transform infrared (FTIR) microscopy, as well as wet chemistry methods, including a test for soluble sugars from the possible breakdown of the wood polymers. The physical assessment included tensile testing to uncover possible differences in mechanical properties, and an environmental chamber was used to test the moisture response of wood of different ages, since moisture is the most damaging environmental factor for wooden cultural objects. The project uncovered several differences, both physical and chemical, between the modern and historic wood that could affect the success of traditional 'like for like' repairs. Both oak and pine lost acetyl groups from their hemicellulose polymers over historic time. This chemical reaction releases acetic acid, which had no effect on the historic oak but was associated with reduced stiffness in the historic pine, probably due to degradation of the hemicellulose polymers by acid hydrolysis. The stiffness of historic oak and pine was also reduced by decay. Visible pest decay led to loss of wood density, but there was evidence that fungal decay, extending beyond what was visible, degraded the S2 layer of the pine cell walls, reducing the stiffness of the wood by depleting the cellulose microfibrils most closely aligned with the grain. Fungal decay of polysaccharides in the pine left behind sugars that attracted increased levels of moisture. The age-related degradation of essential polymers in the wood structure had different impacts on the two species, raising questions about both the mechanism of wood aging and the way traditional repairs are implemented, especially in Scots pine: such repairs need to be done with more care and precision, especially in choosing new timber to match the old. Within the project, a quantitative method of measuring the microfibril angle (MFA) of wood using polarised Fourier-transform infrared (FTIR) microscopy was also developed, allowing the MFA of both new and historic pine to be measured. This provides some of the information needed for a more specific match when selecting replacement timbers for historic buildings.

Relevance: 10.00%

Abstract:

Generalised refraction is a topic which has, thus far, garnered far less attention than it deserves. The purpose of this thesis is to highlight the potential that generalised refraction offers with regard to imaging and its application to the design of new passive optical devices. Specifically, this thesis explores two types of generalised refraction which take place across a planar interface: refraction by generalised confocal lenslet arrays (gCLAs), and refraction by ray-rotation sheets. We show that the corresponding laws of refraction for these interfaces produce, in general, light-ray fields with non-zero curl, and as such have no corresponding outgoing waveform. We then show that gCLAs perform integral, geometrical imaging, which enables them to be considered as approximate realisations of metric-tensor interfaces. The concept of piecewise transformation optics is introduced, and we show that it is possible to use gCLAs together with other optical elements, such as lenses, to design simple piecewise transformation-optics devices such as invisibility cloaks and insulation windows. Finally, we show that ray-rotation sheets can be interpreted as performing geometrical imaging into complex space, and that, as a consequence, ray-rotation sheets and gCLAs may be more closely related than first realised. We conclude with a summary of potential future projects which lead naturally from the results of this thesis.
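For context, a standard ray-optics condition, supplied by us rather than quoted from the thesis, makes the wave/ray tension precise: a ray-direction field corresponds to an outgoing wave only if it is the gradient of a phase, which forces it to be curl-free.

```latex
% A ray-direction field \hat{d}(r) is the normal field of some wavefront
% only if it derives from a phase function \phi:
k\,\hat{\mathbf{d}}(\mathbf{r}) = \nabla\phi(\mathbf{r})
\quad\Longrightarrow\quad
\nabla\times\hat{\mathbf{d}} = \mathbf{0}.
```

The non-zero-curl ray fields produced by gCLAs and ray-rotation sheets therefore have no exact wave counterpart, which is why their imaging has to be understood geometrically.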

Relevance: 10.00%

Abstract:

Hypertension is a major risk factor for cardiovascular disease and mortality, and a growing global public health concern, with up to one-third of the world's population affected. Despite the vast amount of evidence for the benefits of blood pressure (BP) lowering accumulated to date, elevated BP is still the leading risk factor for disease and disability worldwide. It is well established that hypertension and BP are common complex traits, where multiple genetic and environmental factors contribute to BP variation. Furthermore, family and twin studies confirm the genetic component of BP, with heritability estimates in the range of 30-50%. Contemporary genomic tools, which enable the genotyping of millions of genetic variants across the human genome in an efficient, reliable, and cost-effective manner, have transformed hypertension genetics research. This has been accompanied by international consortia offering unprecedentedly large sample sizes for genome-wide association studies (GWASs). While GWASs for hypertension and BP have identified more than 60 loci, variants in these loci are associated with modest effects on BP and in aggregate explain less than 3% of the variance in BP. The aim of this thesis is to study the genetic and environmental factors that influence BP and hypertension traits in the Scottish population through several genetic epidemiological analyses. The first part of the thesis examines the burden of hypertension in the Scottish population and assesses the familial aggregation and heritability of BP and hypertension traits. The second part validates the association of common SNPs reported in large GWASs and estimates the variance explained by these variants. Comprehensive genetic epidemiology analyses were performed on Generation Scotland: Scottish Family Health Study (GS:SFHS), one of the largest population-based family design studies. The availability of clinical and biological samples, self-reported information, and medical records for study participants allowed several assessments of the factors that influence BP variation in the Scottish population. Of the 20,753 subjects genotyped in the study, 18,470 individuals (grouped into 7,025 extended families) passed the stringent quality control (QC) criteria and were available for subsequent analysis. Based on the sources of BP-lowering treatment exposure, subjects were further classified into two groups: first, subjects with both a self-reported medication (SRM) history and electronic prescription records (EPRs; n = 12,347); second, all subjects with at least one medication history source (n = 18,470). In the first group, the analysis showed good concordance between SRMs and EPRs (kappa = 71%), indicating that SRMs can be used as a surrogate to assess exposure to BP-lowering medication in GS:SFHS participants. Although both sources have limitations, SRMs can be considered the best available source for estimating drug exposure history in those without EPRs. The prevalence of hypertension was 40.8%, higher in men (46.3%) than in women (35.8%). The prevalence of awareness, treatment and controlled hypertension, as defined by the study, was 25.3%, 31.2%, and 54.3%, respectively.

These figures are lower than those reported for comparable populations, with the exception of the prevalence of controlled hypertension, which compares favourably. Odds of hypertension were higher in men, in obese or overweight individuals, in people with a parental history of hypertension, and in those living in the most deprived areas of Scotland. Conversely, deprivation was associated with higher odds of treatment, awareness and controlled hypertension, suggesting that people living in the most deprived areas may have been receiving better quality of care, or may have higher comorbidity levels requiring greater engagement with doctors. These findings highlight the need for further work to improve hypertension management in Scotland. The family design of GS:SFHS allowed family-based analyses of the familial aggregation and heritability of BP and hypertension traits. The familial correlation of BP traits ranged from 0.07 to 0.20 for parent-offspring pairs and from 0.18 to 0.34 for sibling pairs, with higher correlations among first-degree relatives than among other types of relative pairs. A variance-component model adjusted for sex, body mass index (BMI), age, and age-squared was used to estimate the heritability of BP traits, which ranged from 24% to 32%, with pulse pressure (PP) having the lowest estimate. The genetic correlations between BP traits were high among systolic BP (SBP), diastolic BP (DBP) and mean arterial pressure (MAP) (rG: 81% to 94%), but lower between these traits and PP (rG: 22% to 78%). The sibling recurrence risk ratios (λS) for hypertension and treatment were 1.60 and 2.04, respectively. These findings confirm the genetic component of BP traits in GS:SFHS and justify further work to investigate the genetic determinants of BP. Genetic variants reported in recent large GWASs of BP traits were selected for genotyping in GS:SFHS using a custom-designed TaqMan® OpenArray®. The genotyping plate included 44 single nucleotide polymorphisms (SNPs) previously reported to be associated with BP or hypertension at genome-wide significance. A linear mixed model adjusted for age, age-squared, sex, and BMI was used to test for association between the genetic variants and BP traits. Of the 43 variants that passed QC, 11 showed a statistically significant association with at least one BP trait. The phenotypic variance explained by these variants was 1.4%, 1.5%, 1.6%, and 0.8% for SBP, DBP, MAP, and PP, respectively. A genetic risk score (GRS) constructed from the selected variants showed a positive association with BP level and hypertension prevalence, with an average increase of one mmHg per 0.80-unit increase in the GRS across the different BP traits. The impact of BP-lowering medication on genetic association studies of BP traits is well established, and it is typical practice to add a fixed value (i.e. 15/10 mmHg) to the measured BP of treated individuals. Using the subset of participants with both treatment exposure sources (i.e. SRMs and EPRs), the influence of using either source to justify the addition of these fixed values on SNP association signals was analysed. BP phenotypes derived from EPRs were considered the true phenotypes, and those derived from SRMs were considered less accurate, i.e. subject to some phenotypic noise.

Comparing SNP association signals across the four BP traits between the two models derived from the different adjustments showed that MAP was the least affected by this phenotypic noise: for MAP the same significant SNPs were identified under both models, whereas the other BP traits showed some discrepancy between the two sources.
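As a hedged illustration of the GRS construction step, a generic weighted allele score; the file names, column names, and values below are placeholders, not the thesis's data:

```python
import numpy as np
import pandas as pd

# dosages: DataFrame (individuals x SNPs) of risk-allele dosages in {0, 1, 2};
# weights: per-SNP effect sizes in mmHg from the discovery GWAS.
dosages = pd.read_csv("dosages.csv", index_col=0)
weights = pd.read_csv("weights.csv", index_col=0)["beta_mmhg"]

# Weighted genetic risk score: sum over SNPs of dosage * effect size,
# aligned on shared SNP identifiers.
snps = dosages.columns.intersection(weights.index)
grs = dosages[snps].to_numpy() @ weights[snps].to_numpy()

# A simple check of the reported pattern: regress SBP on the GRS
# (np.polyfit returns the slope and intercept of the least-squares line).
sbp = pd.read_csv("phenotypes.csv", index_col=0)["sbp"]
slope, intercept = np.polyfit(grs, sbp.loc[dosages.index].to_numpy(), 1)
print(f"mmHg per unit GRS: {slope:.2f}")
```

The thesis's actual association testing used a linear mixed model that accounts for family relatedness; the plain regression above is only a sanity check of the score's direction of effect.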

Relevance: 10.00%

Abstract:

Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability compared to Alpgen, the suitability of MadGraph for the generation of tt̄bb̄ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process tt̄H, H → bb̄. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with a top-quark pair and decaying into a bb̄ pair, using 20.3 fb⁻¹ of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN's Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of 4.2 (+4.1/−2.0) times the Standard Model expectation; the corresponding observed limit is found to be 5.9, within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass mH± in the range 200 ≤ mH±/GeV ≤ 600, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model physics process gb → tH± → tt̄b, with one top quark decaying leptonically and the other hadronically, is presented, using the same 20.3 fb⁻¹ 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the H± boson are computed for six mass points, and these are found to be compatible within experimental uncertainty with those obtained by the corresponding published ATLAS analysis.
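To illustrate what setting a 95% confidence-level limit means in the simplest possible setting, here is a toy single-bin counting experiment using the CLs prescription; this is our sketch with placeholder numbers, not the GlaNtp/RooFit workflow used in the thesis.

```python
from scipy.stats import poisson

# Toy inputs: expected background b, SM signal yield s_sm, observed count.
b, s_sm, n_obs = 50.0, 4.0, 56  # placeholder numbers

def cls(mu):
    """CLs = CL_{s+b} / CL_b for a simple count-based test statistic,
    where mu scales the signal relative to the SM expectation."""
    cl_sb = poisson.cdf(n_obs, b + mu * s_sm)  # P(n <= n_obs | s+b)
    cl_b = poisson.cdf(n_obs, b)               # P(n <= n_obs | b only)
    return cl_sb / cl_b

# Scan mu upward until CLs drops below 0.05: that mu is the 95% CL limit.
mu = 0.0
while cls(mu) > 0.05:
    mu += 0.01
print(f"95% CL upper limit on mu: {mu:.2f} x SM")
```

Real analyses replace the single count with binned discriminant distributions, profile the systematic uncertainties as nuisance parameters, and derive expected limits and their bands from ensembles of pseudo-experiments.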

Relevance: 10.00%

Abstract:

Droplet microfluidics is an active multidisciplinary area of research that evolved out of the larger field of microfluidics. It enables the user to handle, process and manipulate micrometer-sized emulsion droplets on a microfabricated platform. The capability to carry out a large number of individual experiments per unit time makes droplet microfluidic technology an ideal high-throughput platform for the analysis of biological and biochemical samples. The objective of this thesis was to use such a technology for designing systems with novel implications in the newly emerging field of synthetic biology. Chapter 4, the first results chapter, introduces a novel method of droplet coalescence using a flow-focusing capillary device. In Chapter 5, the development of a microfluidic platform for the fabrication of a cell-free micro-environment for site-specific gene manipulation and protein expression is described. Furthermore, a novel fluorescent reporter system which functions both in vivo and in vitro is introduced in this chapter. Chapter 6 covers the microfluidic fabrication of polymeric vesicles from poly(2-methyloxazoline-b-dimethylsiloxane-b-2-methyloxazoline) tri-block copolymer. The polymersome made from this polymer was used in the next chapter for the study of a chimeric membrane protein called mRFP1-EstA∗. In Chapter 7, the application of microfluidics for the fabrication of synthetic biological membranes to recreate artificial cell-like chassis structures for reconstitution of a membrane-anchored protein is described.