937 results for Spermatic quality analysis
Abstract:
This paper presents a design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid process control in micro- and nano-electronics-based manufacturing. The methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB) milling. The process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved with this technology. Process performance is characterised using Reduced Order Models (ROMs), generated from the results of a mathematical model of the focused ion beam together with Design of Experiments (DoE) methods. Two ion beam sources, argon and gallium, have been used to compare and quantify the process variable uncertainties that can be observed during milling. The evaluation of process performance takes these uncertainties and variations of the process variables into account and is used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task then identifies the optimal process conditions, by varying the process variables, so that specified quality objectives and requirements are achieved and imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
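To make the ROM-plus-optimisation workflow concrete, the following minimal Python sketch fits a quadratic response surface to hypothetical DoE samples and then searches it for settings that hit a target milled depth. The variable names and numbers are illustrative assumptions, not the paper's actual FIB model or tooling.

```python
# Illustrative sketch only: a quadratic response-surface ROM fitted to
# Design-of-Experiments samples, then optimised under bounds. The inputs
# (beam current, dwell time) and output (milled depth) are hypothetical.
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3x3 full-factorial DoE: (beam_current_nA, dwell_time_us) -> depth_nm
X = np.array([[1, 1], [1, 2], [1, 3], [2, 1], [2, 2],
              [2, 3], [3, 1], [3, 2], [3, 3]], dtype=float)
y = np.array([11.0, 21.0, 29.0, 19.0, 42.0, 58.0, 32.0, 61.0, 88.0])

def features(x):
    # Quadratic basis: 1, c, t, c*t, c^2, t^2
    c, t = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(c), c, t, c * t, c**2, t**2], axis=-1)

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)  # fit the ROM

def rom_depth(x):
    return features(np.asarray(x, dtype=float)) @ coef

target = 50.0  # desired milled depth (nm)
res = minimize(lambda x: (rom_depth(x) - target) ** 2,
               x0=[2.0, 2.0], bounds=[(1.0, 3.0), (1.0, 3.0)])
print("optimal settings:", res.x, "predicted depth:", rom_depth(res.x))
```

The same pattern extends to multiple quality objectives and constraints by replacing the scalar squared-error objective with a weighted or constrained formulation.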
Abstract:
Despite being exposed to the harsh sea-spray environment of the North Sea at Arbroath, Scotland, for over 63 years, many of the reinforced concrete precast beam elements of the 1.5 km long promenade railing are still in very good condition and show little evidence of reinforcement corrosion. In contrast, railing replacements constructed around 1968 and in 1993 are almost all badly cracked as a result of extensive corrosion of the longitudinal reinforcement, despite the newer concrete appearing to be of better quality than the 1943 concrete. Statistics for maximum crack width in each of the three populations, based on measurements made in 2004 and 2006, are presented. In situ and laboratory measurements show that the 1943 concrete appears to have high permeability but also high electrical resistivity. Chloride penetration measurements show the 1943 and 1993 concretes to have similar chloride profiles and similar chloride concentrations at the reinforcement bars. This is inconsistent with the 1943 beams showing much less reinforcement corrosion than their later replacements, and casts doubt on the conventional practice of durability design focusing on reducing concrete permeability through denser concretes or greater cover.
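As an aside on how maximum-crack-width statistics of this kind are often summarised, the sketch below fits an extreme-value (Gumbel) distribution to invented width samples for three populations. This is an illustrative technique with made-up data, not the Arbroath measurements or the paper's own analysis.

```python
# Hedged illustration only: summarising per-population maximum crack
# widths with a Gumbel (extreme-value) fit. All widths are invented.
from scipy import stats

crack_widths_mm = {  # hypothetical maximum-crack-width samples
    "1943 beams": [0.05, 0.08, 0.10, 0.07, 0.12],
    "1968 beams": [0.60, 0.90, 1.20, 0.75, 1.50],
    "1993 beams": [0.40, 0.70, 0.95, 0.55, 1.10],
}

for name, widths in crack_widths_mm.items():
    loc, scale = stats.gumbel_r.fit(widths)
    w95 = stats.gumbel_r.ppf(0.95, loc, scale)  # 95th-percentile width
    print(f"{name}: mode {loc:.2f} mm, 95th percentile {w95:.2f} mm")
```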
Abstract:
Over the past 15 years in the UK, the state has acquired powers that mark a qualitative shift in its relationship to higher education. Since the introduction and implementation of the Further and Higher Education Act 1992, the Teaching and Higher Education Act 1998 and the Higher Education Act 2004, a whole raft of changes has occurred, including widening participation; the development of interdisciplinary, experiential and workplace-based learning focused on a theory-practice dialogue; quality assurance; and new funding models which encompass public and private partnerships. The transformation of higher education can be placed in the context of New Labour's overall strategies for the overarching reform of public services, as set out in the Prime Minister's Strategy Unit's discussion paper The UK Government's Approach to Public Service Reform (2006). An optimistic view of these changes is that they simultaneously obey democratic and economic imperatives: there is an avowed commitment, through the widening participation agenda, to social inclusion and citizenship, and to providing the changing skills base necessary for the global economy. A more cynical view is that, under critical scrutiny, these changes can be seen not only as emancipatory but also as mobilising regulatory and disciplinary practices. This paper reflects on what kinds of teaching and learning are promoted by the new relationship between the state and the university. It argues that, whilst governmental directives for innovation and transformation in teaching and learning allegedly empower students and put their interests at the centre, the reforms can also be seen to consist of supervisory and controlling mechanisms with regard both to our own practices as teachers and to the knowledge/learning we provide for students.
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. Reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted 'long-life' medium (Betamax) would be approximately £15,000. It was not possible to view the tapes as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to become beyond salvage within one to two years.
3. Video cassette material is in good condition and is expected to remain so for several more years at least. Images viewed were generally of poor quality, and the speed of tow often makes pictures blurred. No immediate action is required.
4. Colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic because there are no between-frame breaks between images, and scanning machines centre each image based on between-frame breaks. The minimum cost to scan all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image and, all in all, it would seem most economic to purchase a 'continuous film' scanner and undertake the work in-house.
5. Positional information in ships' logs has been matched to films and to video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine was developed to convert these to the decimal degrees required for GIS mapping (a conversion of the kind sketched after this list). However, it is unclear whether corrections to Decca positions were applied at the time each position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least 200 hours and is not recommended.
8. Once colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to resolve the Decca correction question would improve the accuracy of image locations.
10. Using codings (produced by Holme to identify different seabed types), and some viewing of video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall 'survey' will be 'English Channel towed video sled survey'. The 'events' become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates. The four samples would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level seabed types found along part of the northern English Channel. More recent images taken during SCUBA diving of reef habitats in the same area as the towed sledge surveys could be added to the Holme images.
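As an illustration of the co-ordinate conversion referred to in item 5, the following minimal Python sketch converts degrees, minutes and seconds to the signed decimal degrees needed for GIS. The example position is hypothetical, and the sketch does not address the separate question of whether Decca corrections were applied.

```python
# Minimal sketch of the DMS-to-decimal-degrees step mentioned in item 5.
def dms_to_decimal(degrees: int, minutes: int, seconds: float,
                   hemisphere: str) -> float:
    """Convert a degrees/minutes/seconds position to signed decimal degrees."""
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ("S", "W") else dd

# Hypothetical example: 50 deg 21' 36" N, 4 deg 9' 0" W (English Channel)
lat = dms_to_decimal(50, 21, 36.0, "N")  # 50.36
lon = dms_to_decimal(4, 9, 0.0, "W")     # -4.15
print(lat, lon)
```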
Abstract:
Noise is one of the main factors degrading the quality of multichannel remote sensing data, and its presence affects classification efficiency, object detection, etc. Pre-filtering is therefore often used to remove noise and improve performance on the final tasks of multichannel remote sensing. Recent studies indicate that the classical additive noise model is not adequate for images formed by modern multichannel sensors operating in the visible and infrared bands; however, this fact is often ignored by researchers designing noise removal methods and algorithms. We therefore focus on the classification of multichannel remote sensing images when signal-dependent noise is present in the component images. Three approaches to filtering multichannel images under the considered noise model are analysed, all based on the discrete cosine transform (DCT) applied in blocks. The study is carried out not only in terms of conventional filtering efficiency metrics (MSE) but also in terms of multichannel data classification accuracy (probability of correct classification, confusion matrix). The proposed classification system combines a pre-processing stage, in which a DCT-based filter processes the blocks of the multichannel remote sensing image, with the classification stage. Two modern classifiers are employed: a radial basis function neural network and support vector machines. Simulations are carried out for a three-channel image from the Landsat TM sensor. Different training cases are considered: using noise-free samples of the test multichannel image, the noisy multichannel image, and the pre-filtered one. It is shown that training on the pre-filtered image produces better classification than training on the noisy image. The best results for both groups of quantitative criteria are obtained when a proposed 3D discrete cosine transform filter equipped with a variance-stabilising transform is applied. The classification results obtained for data pre-filtered in different ways are in agreement for both classifiers. A comparison of classifier performance is also carried out: the radial basis function neural network classifier is less sensitive to noise in the original images, but after pre-filtering the performance of both classifiers is approximately the same.
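To illustrate the general idea of combining a variance-stabilising transform (VST) with block-DCT filtering, the following simplified 2D Python sketch applies an Anscombe-style VST for signal-dependent noise, hard-thresholds DCT coefficients in 8x8 blocks, and inverts the transform. The noise parameters and threshold are assumptions, and the paper's actual filter is a 3D variant operating across channels.

```python
# Illustrative 2D sketch (the paper's filter is 3D): variance-stabilising
# transform for noise with variance k*signal + sigma^2, then block-DCT
# hard thresholding, then the inverse transform. k, sigma, thr assumed.
import numpy as np
from scipy.fft import dctn, idctn

def vst(img, k, sigma):
    return 2.0 / k * np.sqrt(np.maximum(k * img + sigma**2, 0.0))

def inverse_vst(img, k, sigma):
    return ((img * k / 2.0) ** 2 - sigma**2) / k

def dct_denoise(img, k=0.1, sigma=2.0, block=8, thr=3.0):
    stab = vst(img, k, sigma)
    out = np.zeros_like(stab)
    h, w = stab.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(stab[i:i+block, j:j+block], norm="ortho")
            coeffs[np.abs(coeffs) < thr] = 0.0  # hard thresholding
            out[i:i+block, j:j+block] = idctn(coeffs, norm="ortho")
    return inverse_vst(out, k, sigma)

noisy = np.random.default_rng(0).gamma(shape=5.0, scale=10.0, size=(64, 64))
print(dct_denoise(noisy).shape)  # (64, 64)
```

In practice the thresholding would use overlapping blocks and a threshold tied to the stabilised noise variance; this sketch keeps only the core transform-threshold-invert structure.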
Abstract:
A well-documented, publicly available, global data set of surface ocean carbon dioxide (CO2) parameters has been called for by international groups for nearly two decades. The Surface Ocean CO2 Atlas (SOCAT) project was initiated by the international marine carbon science community in 2007 with the aim of providing a comprehensive, publicly available, regularly updated, global data set of marine surface CO2, which had been subject to quality control (QC). Many additional CO2 data, not yet made public via the Carbon Dioxide Information Analysis Center (CDIAC), were retrieved from data originators, public websites and other data centres. All data were put in a uniform format following a strict protocol. Quality control was carried out according to clearly defined criteria. Regional specialists performed the quality control, using state-of-the-art web-based tools, specially developed for accomplishing this global team effort. SOCAT version 1.5 was made public in September 2011 and holds 6.3 million quality controlled surface CO2 data points from the global oceans and coastal seas, spanning four decades (1968–2007). Three types of data products are available: individual cruise files, a merged complete data set and gridded products. With the rapid expansion of marine CO2 data collection and the importance of quantifying net global oceanic CO2 uptake and its changes, sustained data synthesis and data access are priorities.
Abstract:
The objective of the study was to determine the levels of glucose and triglycerides in the seminal plasma of 10 guinea pigs, which were fed for a period of 2 months with a diet containing 10% more ED. The level of glucose found in seminal plasma was 11.59 ± 0.5 mg/dL and the triglyceride value was 55.95 ± 3.2 mg/dL, while motility averaged 97%. We conclude that in guinea pigs the levels of both glucose and triglycerides were increased by the higher level of ED in the feed, but sperm motility was not.
Abstract:
Queen's University Library was one of 202 libraries, including 57 members of the Association of Research Libraries (ARL), to survey its users in spring 2004 using the LibQUAL+ survey instrument. LibQUAL+ was designed by ARL to assist libraries in assessing the quality of their services and identifying areas for improvement.
# Overall: Queen's scored higher than the average for all ARL participants and first among the 2004 Canadian participants. This relatively high rating is due to very high scores in the dimensions of Library as Place and Affect of Service. However, there is considerable need for improvement in the area of Information Control, where Queen's rated well below the ARL average.
# Affect of Service: Queen's strong overall ratings are supported by the many respondent comments praising customer service throughout the system. The ratings and survey comments indicate the greatest appreciation by faculty and more experienced students (e.g. graduate students) for the instruction and on-site services provided by the libraries. The ratings also indicate that undergraduates, having grown up with the web, want and expect to be able to access library resources independently, and do not value these services as highly. The comments also indicated some specific areas for improvement throughout the library system.
# Library as Place: All Queen's libraries except Law ranked well above the ARL and Canadian averages. Overall, Library as Place ranked lowest in importance among the service dimensions for all ARL participants, including Queen's. Comparative analysis of LibQUAL+ results since the survey began shows a decline in "desired" ratings for Library as Place. However, undergraduates continue to give strong "desired" ratings to certain aspects of Library as Place and a relatively high rating for "minimum expected" service. The comments from Queen's survey respondents and ARL's analyses of focus groups indicate that undergraduates value the library much more as a place to study and work with peers than for its on-site resources and services.
# Information Control: This is the area in greatest need of attention. While it ranked highest in importance for all user groups by a wide margin, Queen's performed poorly in this category. Overall, Queen's ranked far below both the ARL average and the top three Canadian scores. However, the major dissatisfaction was concentrated in the humanities/social sciences (Stauffer primary users) and the health sciences (Bracken primary users), where the overall rating of perceived service quality ranked below the minimum expected service rating. Primary users of the Education, Engineering/Science and Law libraries rated this service dimension higher than the ARL average. The great success of the Canadian National Site Licensing Project (CNSLP) is reflected in the high overall rating generated by Engineering/Science Library users. The low ratings from the humanities and social sciences are supported by respondents' comments and are generally consistent with other ARL participants.
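For readers unfamiliar with LibQUAL+ scoring, the sketch below shows the standard gap-score arithmetic (adequacy gap is perceived minus minimum; superiority gap is perceived minus desired) with invented numbers. A negative adequacy gap corresponds to the "perceived below minimum expected" finding reported above for Information Control.

```python
# Minimal sketch of LibQUAL+ gap-score arithmetic with made-up numbers;
# the actual survey scores individual items on 9-point scales.
ratings = {
    # dimension: (minimum_expected, desired, perceived)
    "Affect of Service":   (6.2, 7.9, 7.4),
    "Library as Place":    (5.4, 7.1, 6.8),
    "Information Control": (6.6, 8.2, 6.3),
}

for dim, (minimum, desired, perceived) in ratings.items():
    adequacy = perceived - minimum     # negative => below minimum expected
    superiority = perceived - desired  # rarely positive in practice
    print(f"{dim}: adequacy gap {adequacy:+.1f}, "
          f"superiority gap {superiority:+.1f}")
```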
Abstract:
Background: A previous review suggested that the MacNew Quality of Life Questionnaire was the most appropriate disease-specific measure of health-related quality of life among people with ischaemic heart disease. However, there is ambiguity about the allocation of items to the three factors underlying the MacNew, and the factor structure has not previously been confirmed among people in the UK. Methods: The MacNew Questionnaire and the SF-36 were administered to 117 patients newly admitted to a tertiary referral centre in Northern Ireland. All patients had been diagnosed with ischaemic heart disease. Results: A confirmatory factor analysis was conducted on the factor structure of the MacNew, and the model was found to be an inadequate fit to the data. A quantitative and qualitative analysis of the items suggested that a five-factor solution was more appropriate, and this was validated by confirmatory factor analysis. The new structure also displayed strong evidence of concurrent validity when compared with the SF-36. Conclusion: We recommend that researchers submit scores obtained from items on the MacNew to secondary analyses after grouping them according to the factor structure proposed in this paper, in order to explore further the most appropriate grouping of items.
Abstract:
Can learning quality be maintained in the face of increasing class size by the use of Computer Supported Co-operative Learning (CSCL) technologies? In particular, can Computer-Mediated Communication promote critical thinking in addition to surface information transfer? We compared face-to-face seminars with asynchronous computer conferencing in the same Information Management class. Drawing on Garrison's theory of critical thinking and Henri's critical reasoning skills, we developed two ways of evaluating critical thinking: a student questionnaire and a content analysis technique. We found evidence for critical thinking in both situations, with some subtle differences in learning style. This paper provides an overview of this work.
Visual functioning and quality of life in the subfoveal radiotherapy study (SFRADS): SFRADS report 2
Abstract:
Aims: To determine whether self-reported visual functioning and quality of life in patients with choroidal neovascularisation (CNV) caused by age-related macular degeneration (AMD) are better in those treated with 12 Gy external beam radiotherapy than in untreated subjects. Methods: A multicentre, single-masked, randomised controlled trial of 12 Gy of external beam radiation therapy (EBRT), delivered as 6 × 2 Gy fractions to the macula of an affected eye, versus observation. Participants were patients with AMD, aged 60 years or over, in three UK hospital units, who had subfoveal CNV and a visual acuity equal to or better than 6/60 (logMAR 1.0). Data from 199 eligible participants who were randomly assigned to 12 Gy teletherapy or observation were available for analysis. Visual function assessment, ophthalmic examination, and fundus fluorescein angiography were undertaken at baseline and at 3, 6, 12, and 24 months after study entry. To assess patient-centred outcomes, subjects were asked to complete the Daily Living Tasks Dependent on Vision (DLTV) and SF-36 questionnaires at baseline and at 6, 12, and 24 months after enrolment. Cross-sectional and longitudinal analyses were conducted using arm of study as the grouping variable, and regression analysis was employed to adjust for the effect of baseline covariates on outcome at 12 and 24 months. Results: Both control and treated subjects had significant losses in visual functioning, seen as a progressive decline in mean scores in the four dimensions of the DLTV. There were no statistically significant differences between treated and control subjects in any of the dimensions of the DLTV at 12 or 24 months after study entry. Regression analysis confirmed that treatment status had no effect on the change in DLTV dimensional scores. Conclusions: The small benefits noted in clinical measures of vision in treated eyes did not translate into better self-reported visual functioning compared with the control arm. These findings have implications for the design of future clinical trials and studies.
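The baseline-adjusted comparison described in the Methods can be illustrated with a minimal ANCOVA-style regression in Python. The column names and simulated data below are assumptions for demonstration, not the SFRADS dataset.

```python
# Hedged sketch of a baseline-adjusted (ANCOVA-style) regression of the
# kind described above: follow-up score modelled on trial arm plus the
# baseline score. Column names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 199
df = pd.DataFrame({
    "arm": rng.integers(0, 2, n),           # 0 = observation, 1 = EBRT
    "dltv_baseline": rng.normal(70, 10, n),
})
# Simulated 12-month outcome with essentially no treatment effect.
df["dltv_12m"] = (0.8 * df["dltv_baseline"] - 5.0
                  + 0.5 * df["arm"] + rng.normal(0, 8, n))

model = smf.ols("dltv_12m ~ arm + dltv_baseline", data=df).fit()
print(model.params)                  # arm coefficient small relative to its CI
print(model.conf_int().loc["arm"])   # interval straddling zero: no arm effect
```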
Abstract:
Purpose – The aim of this paper is to analyse how critical incidents or organisational crises can be used to check and legitimise quality management change efforts in relation to the fundamental principles of quality.
Design/methodology/approach – Multiple case studies analyse critical incidents that demonstrate the importance of legitimisation, normative evaluation and conflict constructs in this process. A theoretical framework composed of these constructs is used to guide the analysis.
Findings – The cases show that the critical incidents leading to the legitimisation of continuous improvement (CI) were diverse. However, all resulted in the need for significant ongoing cost reduction to achieve or retain competitiveness. In addition, attempts at legitimising CI were coupled with attempts at destabilising the existing normative practice. This destabilisation process in some cases advocated supplementing the existing approaches and in others replacing them. In all cases, significant conflict arose in these legitimising and normative evaluation processes.
Research limitations/implications – It is suggested that further research could involve a critical analysis of existing quality models, tools and techniques in relation to how they incorporate, and are built upon, fundamental quality management principles. Furthermore, such studies could probe the dangers of the quality curriculum becoming divorced from business and market reality and thus creating a parallel existence.
Practical implications – As demonstrated by the case studies, models, tools and techniques are not valued for their intrinsic worth but rather for what they contribute to addressing business needs. Thus, in addition to being an opportunity for quality management, critical incidents present a challenge to the field: quality management must be shown to make a contribution in these circumstances.
Originality/value – This paper is of value to both academics and practitioners.
Abstract:
Permeable reactive barriers (PRBs) of zero-valent iron (Fe0) are increasingly being used to remediate contaminated ground water. Corrosion of Fe0 filings and the formation of precipitates can occur when the PRB material comes into contact with ground water and may reduce the lifespan and effectiveness of the barrier. At present, there are no routine procedures for preparing and analyzing the mineral precipitates from Fe0 PRB material. Such procedures are needed because the mineralogical composition of corrosion products used to interpret barrier processes can change with iron oxidation and with sample preparation. The objectives of this study were (i) to investigate a method of preparing Fe0 reactive barrier material for mineralogical analysis by X-ray diffraction (XRD), and (ii) to identify Fe mineral phases and the rates of transformation induced by different preparation techniques. Materials from an in situ Fe0 PRB were collected by undisturbed coring and processed for XRD analysis at different times after sampling, for three size fractions, and under various drying treatments. We found that whole-sample preparation was necessary because mineral precipitates occurred in different size fractions of the PRB samples. Green rusts quickly disappeared from acetone-dried samples and were not present in air-dried and oven-dried samples. Maghemite/magnetite content increased over time and in oven-dried samples, especially after heating to 105°C. We conclude that care must be taken during sample preparation of Fe0 PRB material, especially for the detection of green rusts, to ensure accurate identification of the minerals present within the barrier system.
Abstract:
Wireless-enabled portable devices must operate with the highest possible energy efficiency while still maintaining a minimum level and quality of service to meet users' expectations. The authors analyse the performance of a new pointer-based medium access control protocol designed to significantly improve the energy efficiency of user terminals in wireless local area networks. The new protocol, the pointer-controlled slot allocation and resynchronisation protocol (PCSAR), is based on the existing IEEE 802.11 point coordination function (PCF) standard. PCSAR reduces energy consumption by removing the need for power-saving stations to remain awake and listen to the channel. Using OPNET, simulations were performed under symmetric channel loading conditions to compare the performance of PCSAR with the infrastructure power-saving mode of IEEE 802.11, PCF-PS. The simulation results demonstrate a significant improvement in energy efficiency, without significant loss of performance, when using PCSAR. For a wireless network consisting of an access point and eight stations in power-saving mode, the energy saving from using PCSAR instead of PCF-PS was up to 31%, depending upon frame error rate and load. The results also show that PCSAR offers significantly reduced uplink access delay over PCF-PS while modestly improving uplink throughput.
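A back-of-envelope duty-cycle model shows how a percentage energy saving of the reported kind is computed. The power draws and awake fractions below are invented for illustration and are not taken from the paper's OPNET simulations.

```python
# Back-of-envelope duty-cycle model of station energy use. Power draws
# and awake fractions are hypothetical, chosen only to illustrate how a
# saving such as the reported "up to 31%" figure is computed.
def station_energy(p_awake_mw, p_doze_mw, frac_awake, period_s=1.0):
    return (p_awake_mw * frac_awake + p_doze_mw * (1 - frac_awake)) * period_s

pcf_ps = station_energy(p_awake_mw=900, p_doze_mw=50, frac_awake=0.40)
pcsar = station_energy(p_awake_mw=900, p_doze_mw=50, frac_awake=0.26)

saving = 100 * (pcf_ps - pcsar) / pcf_ps
print(f"energy saving: {saving:.0f}%")  # ~31% with these assumptions
```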