230 results for Log conformance
Abstract:
Studies have examined the associations between cancers and circulating 25-hydroxyvitamin D [25(OH)D], but little is known about the impact of different laboratory practices on 25(OH)D concentrations. We examined the potential impact of delayed blood centrifuging, choice of collection tube, and type of assay on 25(OH)D concentrations. Blood samples from 20 healthy volunteers underwent alternative laboratory procedures: four centrifuging times (2, 24, 72, and 96 h after blood draw); three types of collection tube (a red-top serum tube and two plasma anticoagulant tubes containing heparin or EDTA); and two types of assay (DiaSorin radioimmunoassay [RIA] and chemiluminescence immunoassay [CLIA/LIAISON®]). Log-transformed 25(OH)D concentrations were analyzed using generalized estimating equations (GEE) linear regression models. We found no difference in 25(OH)D concentrations by centrifuging time or type of assay. There was some indication of a difference by tube type in CLIA/LIAISON®-assayed samples, with concentrations in heparinized plasma (geometric mean, 16.1 ng ml⁻¹) higher than those in serum (geometric mean, 15.3 ng ml⁻¹) (p = 0.01), but the difference was significant only after a substantial centrifuging delay (96 h). Our study suggests that immediate processing of blood samples after collection is unnecessary and that no particular tube type or assay need be required.
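As a concrete illustration of the analysis described above, the sketch below fits a GEE linear regression to log-transformed concentrations with an exchangeable working correlation within volunteer, using statsmodels. The data frame, column names and parameter values are synthetic placeholders, not the study's data.

```python
# A minimal sketch (not the authors' code) of a GEE analysis of
# log-transformed 25(OH)D, with repeated measures clustered per volunteer.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, tubes, delays = 20, ["serum", "heparin", "EDTA"], [2, 24, 72, 96]
rows = [
    {"subject": s, "tube": t, "delay_h": d,
     "conc": rng.lognormal(mean=np.log(15.5), sigma=0.25)}
    for s in range(n_subjects) for t in tubes for d in delays
]
df = pd.DataFrame(rows)
df["log_conc"] = np.log(df["conc"])

# Exchangeable working correlation handles the repeated measurements
# taken on the same volunteer's blood draw.
model = smf.gee("log_conc ~ C(tube) + C(delay_h)", groups="subject",
                data=df, cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()
print(res.summary())

# Back-transforming fitted effects on the log scale gives geometric-mean
# ratios, which is how the 16.1 vs 15.3 ng/ml comparison is expressed.
print(np.exp(res.params))
```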
Abstract:
This paper reports on the opportunities for transformational learning experienced by a group of pre-service teachers who were engaged in service-learning as a pedagogical process with a focus on reflection. Critical social theory informed the design of the reflection process, as it enabled a move away from knowledge transmission toward knowledge transformation. The structured reflection log was designed around the critical social theory expectations of quality learning that teach students to think critically: ideology critique and utopian critique. Butin's lenses and a reflection framework informed by the work of Bain, Ballantyne, Mills and Lester were used in the design of the service-learning reflection log. Reported data provide evidence of transformational learning and highlight how the students critique their world and imagine how they could contribute to a better world in their work as beginning teachers.
Abstract:
ERP systems generally implement controls to prevent certain common kinds of fraud. However, the legal requirement for company audits and the common incidence of fraud point to a pressing need to detect more sophisticated patterns of fraudulent activity as well. This paper describes the design and implementation of a framework for detecting patterns of fraudulent activity in ERP systems. We describe six fraud scenarios and the process of specifying and detecting their occurrence in ERP user log data using the prototype software we have developed. The test results for detecting these scenarios in log data have been verified and confirm the success of our approach, which can be generalized across ERP systems.
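The scenario specifications themselves are not given in this abstract; as a hedged illustration of the general approach, the sketch below detects one hypothetical fraud scenario, a segregation-of-duties violation, in a toy ERP user log. The log schema and field names are assumptions for the example.

```python
# Hypothetical scenario: the same user both creates a vendor record and
# approves a payment to that vendor. Log format is assumed for the example.
from collections import defaultdict

log = [
    {"user": "u17", "action": "create_vendor", "vendor": "V042"},
    {"user": "u17", "action": "approve_payment", "vendor": "V042"},
    {"user": "u03", "action": "approve_payment", "vendor": "V042"},
]

created_by = defaultdict(set)          # vendor -> users who created it
for entry in log:
    if entry["action"] == "create_vendor":
        created_by[entry["vendor"]].add(entry["user"])

suspicious = [
    entry for entry in log
    if entry["action"] == "approve_payment"
    and entry["user"] in created_by[entry["vendor"]]
]
print(suspicious)                      # flags u17's self-approved payment
```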
Abstract:
Gabor representations have been widely used in facial analysis (face recognition, face detection and facial expression detection) due to their biological relevance and computational properties. Two popular Gabor representations used in the literature are: 1) Log-Gabor filters and 2) Gabor energy filters. Even though these representations are somewhat similar, they also have distinct differences: the Log-Gabor filters mimic the simple cells in the visual cortex, while the Gabor energy filters emulate the complex cells, which causes subtle differences in the responses. In this paper, we analyze the difference between these two Gabor representations and quantify these differences on the task of facial action unit (AU) detection. In our experiments conducted on the Cohn-Kanade dataset, we report an average area under the ROC curve (A′) of 92.60% across 17 AUs for the Gabor energy filters, while the Log-Gabor representation achieved an average A′ of 96.11%. This result suggests that the small spatial differences that the Log-Gabor filters pick up on are more useful for AU detection than the differences in contours and edges that the Gabor energy filters extract.
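For readers unfamiliar with the first representation, the sketch below constructs a 2D log-Gabor transfer function in the frequency domain following Field's standard formulation; the parameter values are illustrative and are not those used in the paper's experiments.

```python
import numpy as np

def log_gabor_2d(size, f0=0.1, sigma_ratio=0.55, theta0=0.0, sigma_theta=0.4):
    """2D log-Gabor filter in the frequency domain.

    f0          centre frequency (cycles/pixel)
    sigma_ratio sigma/f0 ratio, controls radial bandwidth
    theta0      filter orientation; sigma_theta is the angular spread
    """
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2] / size
    radius = np.hypot(x, y)
    radius[size // 2, size // 2] = 1.0        # avoid log(0) at DC
    theta = np.arctan2(y, x)

    # Gaussian on a log frequency axis (Field's formulation).
    radial = np.exp(-(np.log(radius / f0) ** 2) /
                    (2 * np.log(sigma_ratio) ** 2))
    radial[size // 2, size // 2] = 0.0        # log-Gabor has no DC response
    d_theta = np.arctan2(np.sin(theta - theta0), np.cos(theta - theta0))
    angular = np.exp(-(d_theta ** 2) / (2 * sigma_theta ** 2))
    return radial * angular

# Filtering: multiply the image spectrum by the transfer function.
img = np.random.rand(64, 64)
response = np.fft.ifft2(np.fft.ifftshift(log_gabor_2d(64)) * np.fft.fft2(img))
```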
Abstract:
This paper presents the results from a study of information behaviors in the context of people's everyday lives, undertaken in order to develop an integrated model of information behavior (IB). Thirty-four participants across six countries maintained a daily information journal or diary, mainly through a secure web log, for two weeks, giving an aggregate of 468 participant days over five months. The text-rich diary data were analyzed using a multi-method qualitative-quantitative approach, in the following order: Grounded Theory analysis with manual coding, automated concept analysis using thesaurus-based visualization, and finally statistical analysis of the coding data. The findings indicate that people engage in several information behaviors simultaneously throughout their everyday lives (including home and work life) and that sense-making is entangled in all aspects of them. Participants engaged in many of the information behaviors in a parallel, distributed, and concurrent fashion: many information behaviors for one information problem, one information behavior across many information problems, and many information behaviors concurrently across many information problems. The findings also indicate that information avoidance, both active and passive, is a common phenomenon, and that information organizing behaviors, or the lack thereof, caused the most problems for participants. An integrated model of information behaviors is presented based on the findings.
Abstract:
This paper introduces a novel technique to directly optimise the Figure of Merit (FOM) for phonetic spoken term detection. The FOM is a popular measure of STD accuracy, making it an ideal candidate for use as an objective function. A simple linear model is introduced to transform the phone log-posterior probabilities output by a phone classifier to produce enhanced log-posterior features that are more suitable for the STD task. Direct optimisation of the FOM is then performed by training the parameters of this model using a non-linear gradient descent algorithm. Substantial relative FOM improvements of 11% are achieved on held-out evaluation data, demonstrating the generalisability of the approach.
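The paper's exact model and training procedure are not reproduced here; the sketch below shows the general shape of the idea under stated assumptions: a linear transform of frame-level phone log-posteriors whose parameters are updated by gradient descent. Because the true FOM is not differentiable, a smooth pairwise ranking surrogate on hit versus false-alarm scores stands in for it, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_phones = 40
# Linear enhancement model, initialised near the identity so training
# starts from (approximately) the raw log-posteriors.
W = np.eye(n_phones) + 0.01 * rng.standard_normal((n_phones, n_phones))

def enhance(logpost):
    # logpost: (frames x phones) matrix of phone log-posteriors.
    return logpost @ W.T

def scores(X):
    # One detection score per example: mean enhanced log-posterior.
    return enhance(X).mean(axis=1)

# Illustrative stand-in data: true-hit frames vs false-alarm frames.
hits = rng.standard_normal((100, n_phones)) + 0.2
fas = rng.standard_normal((100, n_phones))

lr = 0.05
for step in range(200):
    d = scores(fas) - scores(hits)       # want false alarms to score lower
    g = 1.0 / (1.0 + np.exp(-d))         # derivative of softplus(d)
    avg = (g[:, None] * (fas - hits)).mean(axis=0)
    # Gradient of the surrogate ranking loss with respect to W.
    W -= lr * np.outer(np.ones(n_phones), avg) / n_phones
```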
Abstract:
This study assessed the reliability and validity of a palm-top-based electronic appetite rating system (EARS) in relation to the traditional paper-and-pen method. Twenty healthy subjects, 10 male (M) and 10 female (F) (mean age: M = 31 years, S.D. = 8; F = 27 years, S.D. = 5; mean BMI: M = 24, S.D. = 2; F = 21, S.D. = 5), participated in a 4-day protocol. Measurements were made on days 1 and 4. Subjects were given paper and an EARS to log hourly subjective motivation to eat during waking hours. Food intake and meal times were fixed. Subjects were given a maintenance diet (comprising 40% fat, 47% carbohydrate and 13% protein by energy), calculated at 1.6 × Resting Metabolic Rate (RMR), as three isoenergetic meals. Bland and Altman's test for bias between two measurement techniques found significant differences between the EARS and paper and pen for two of the eight responses (hunger and fullness). Regression analysis confirmed that there were no day, sex or order effects between ratings obtained using either technique. For 15 subjects there was no significant difference between results, with a linear relationship between the two methods that explained most of the variance (r² ranged from 62.6% to 98.6%). The slope for all subjects was less than 1, which was partly explained by a tendency for bias at the extreme end of the EARS scale. These data suggest that the EARS is a useful and reliable technique for real-time data collection in appetite research, but that it should not be used interchangeably with paper-and-pen techniques.
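For reference, the Bland and Altman bias test mentioned above amounts to examining the mean difference between paired ratings and its 95% limits of agreement. A minimal sketch on illustrative data (not the study's ratings):

```python
import numpy as np

rng = np.random.default_rng(2)
paper = rng.normal(50, 15, 40)              # paper-and-pen ratings (assumed)
ears = paper + rng.normal(1.5, 4.0, 40)     # EARS ratings with a slight bias

diff = ears - paper
mean_pair = (ears + paper) / 2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)               # half-width of 95% limits

print(f"bias = {bias:.2f}, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}]")
# A Bland-Altman plot is simply diff vs mean_pair with these three
# horizontal reference lines drawn in.
```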
Abstract:
This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and on the computation of normalization constants arose from pursuing these data-analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of recorded zeroes: a zero may represent a zero response given some threshold (presence) or indicate that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses while taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts; the dingo, cypress and toad case studies described in the motivation chapter are examples.

Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors.

The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for the parameters of these main models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics, and model choice can be assessed by incorporating another tier in the modelling hierarchy. This requires evaluation of a normalization constant, a notoriously difficult problem.

The difficulty of estimating the normalization constant for the MRF can be overcome by a path integral approach, although this is highly computationally intensive. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces the background computations required for the full implementation of the four-tier model in Chapter 7.

Two different extensions of the three-tier model to a four-tier version are investigated. The first incorporates temporal dependence in the underlying spatio-temporal process; the second allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer.
A major contribution of the thesis is the development, for the first time, of a fully Bayesian approach to inference for these hierarchical models.
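A minimal sketch of the path-sampling identity that the IMCS method builds on: for a binary MRF with density proportional to exp(beta * S(x)), the derivative of log Z with respect to beta equals the expected canonical statistic E_beta[S], so a log NC ratio is the integral of that mean statistic along a path in beta. The single-parameter Ising-type model below is a deliberate simplification of the thesis's three-parameter autologistic model, with illustrative grid and chain lengths.

```python
import numpy as np

def gibbs_sweep(x, beta, rng):
    # One systematic-scan Gibbs sweep of a binary MRF on a torus with
    # density p(x) proportional to exp(beta * S(x)), where S(x) counts
    # adjacent 1-1 pairs; here p(x_ij = 1 | rest) = sigmoid(beta * nbr_sum).
    n, m = x.shape
    for i in range(n):
        for j in range(m):
            nbr = x[(i - 1) % n, j] + x[(i + 1) % n, j] + \
                  x[i, (j - 1) % m] + x[i, (j + 1) % m]
            x[i, j] = rng.random() < 1.0 / (1.0 + np.exp(-beta * nbr))

def interaction_stat(x):
    # S(x): number of horizontally + vertically adjacent 1-1 pairs (torus).
    return (x * np.roll(x, 1, 0)).sum() + (x * np.roll(x, 1, 1)).sum()

def mean_stat(beta, n=16, sweeps=300, burn=100, seed=0):
    # Estimate E_beta[S] by MCMC; this equals d(log Z)/d(beta).
    rng = np.random.default_rng(seed)
    x = (rng.random((n, n)) < 0.5).astype(np.int64)
    vals = []
    for t in range(sweeps):
        gibbs_sweep(x, beta, rng)
        if t >= burn:
            vals.append(interaction_stat(x))
    return np.mean(vals)

# Path sampling: log Z(b1) - log Z(b0) is the integral of E_beta[S] over
# beta, estimated on a grid with trapezoidal quadrature.
betas = np.linspace(0.0, 0.4, 9)
means = np.array([mean_stat(b, seed=k) for k, b in enumerate(betas)])
log_nc_ratio = np.sum((means[1:] + means[:-1]) / 2 * np.diff(betas))
print(f"estimated log Z(0.4) - log Z(0.0) = {log_nc_ratio:.2f}")
```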
Abstract:
Inspection of solder joints is a critical process in the electronics manufacturing industry, used to reduce manufacturing cost, improve yield, and ensure product quality and reliability. Solder joint inspection is more challenging than many other visual inspection problems because of the variability in the appearance of solder joints. Although much research has been done and various techniques developed to classify defects in solder joints, these methods require complex illumination systems for image acquisition and complicated classification algorithms. An important stage of the analysis is selecting the right classification method. Better inspection technologies are needed to fill the gap between available inspection capabilities and industry requirements. This dissertation aims to provide a solution that overcomes some of the limitations of current inspection techniques. The research proposes a two-stage automatic solder joint classification system. The “front-end” inspection stage comprises illumination normalisation, localization and segmentation. The illumination normalisation approach effectively and efficiently eliminates the effect of uneven illumination while preserving the properties of the processed image. The “back-end” inspection stage classifies solder joints using Log-Gabor filters and classifier fusion. Five levels of solder quality, defined with respect to the amount of solder paste, are considered. The Log-Gabor filter is shown to achieve high recognition rates and to be resistant to misalignment, and further testing demonstrates its advantage over both the Discrete Wavelet Transform and the Discrete Cosine Transform. Classifier score fusion is analysed for improving the recognition rate. Experimental results demonstrate that the proposed system improves performance and robustness in terms of classification rates. The proposed system needs no special illumination system, and the images are acquired with an ordinary digital camera; the choice of suitable features overcomes the problems introduced by using a simple illumination system. The new system proposed in this research can be incorporated in the development of an automated, non-contact, non-destructive and low-cost solder joint quality inspection system.
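The dissertation's trained classifiers are not reproduced here; as a hedged sketch of classifier score fusion of the kind described, the snippet below combines two illustrative score vectors by weighted-sum fusion after min-max normalisation. The weight and the five-level class layout are assumptions for the example.

```python
import numpy as np

def min_max_normalise(scores):
    # Map raw classifier scores onto a common [0, 1] scale before fusing.
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo + 1e-12)

rng = np.random.default_rng(3)
n_classes = 5                        # five solder-paste quality levels
scores_a = rng.random(n_classes)     # e.g. scores from a Log-Gabor pipeline
scores_b = rng.random(n_classes)     # e.g. scores from a second classifier

w = 0.6                              # fusion weight (tuned on validation data)
fused = w * min_max_normalise(scores_a) + (1 - w) * min_max_normalise(scores_b)
print("predicted quality level:", int(np.argmax(fused)))
```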
Abstract:
The late French philosopher Gilles Deleuze has enjoyed significant notoriety and acclaim in American academia over the last 20 years. The distinctive disciplinary focus of the contemporary discussion has derived from Deleuze the architectural possibilities of biotechnology, systems theory, and digital processualism. While Deleuze's theory of science and the formalist readings of Mille Plateaux and Le Bergsonisme have dominated this reception since the 1990s, few are aware of a much earlier encounter between Deleuze and architects, beginning at Columbia University in the 1970s. That encounter converged on the radical politics of Anti-Oedipus and its American reception in the journal Semiotext(e), through which architecture engaged a much broader discourse alongside artists, musicians, filmmakers, and intellectuals in the New York aesthetic underground, of which Deleuze and Félix Guattari were themselves a part.
Abstract:
The lack of a universally accepted and comprehensive taxonomy of cybercrime seriously impedes international efforts to accurately identify, report and monitor cybercrime trends. There is, not surprisingly, a corresponding disconnect internationally on the cybercrime legislation front, a much more serious problem and one which the International Telecommunication Union (ITU) says requires "the urgent attention of all nations". Yet, despite the existence of the Council of Europe (CoE) Convention on Cybercrime, a proposal for a global cybercrime treaty was rejected by the United Nations (UN) as recently as April 2010. This paper presents a refined and comprehensive taxonomy of cybercrime and demonstrates its utility for widespread use. It analyses how the USA, the UK, Australia and the UAE align with the CoE Convention and finds that more needs to be done to achieve conformance. We conclude with an analysis of the approaches used in Queensland, Australia, and in Abu Dhabi, UAE, to fight cybercrime, and identify a number of shared problems.
Abstract:
Information behaviour (IB) is an area within Library and Information Science that studies the totality of human behaviour in relation to information, both active and passive, along with the explicit and tacit mental states related to information. This study reports on recently completed dissertation research that integrates the different models of information behaviour using a diary study in which 34 participants maintained a daily journal for two weeks through a web log or paper diary. This produced thick descriptions of IB, which were manually analysed using the Grounded Theory method of inquiry and then cross-referenced through both text-analysis and statistical-analysis programs. Among the many key findings of this study, one is the focus of this paper: how participants express their feelings about the information seeking process, and their mental and affective states related specifically to the sense-making component, which co-occurs with almost every other aspect of information behaviour. The paper title, Down the Rabbit Hole and Through the Looking Glass, refers to an observation that some of the participants made in their journals when they searched for, or avoided, information: they wrote that they felt as if they had fallen into a rabbit hole where nothing made sense, and reported both positive feelings of surprise and amazement and negative feelings of confusion, puzzlement, apprehensiveness, frustration, stress, ambiguity, and fatigue. The study situates this sense-making aspect of IB within an overarching model of information behaviour that includes IB concepts such as monitoring information, encountering information, information seeking and searching, flow, multitasking, information grounds, information horizons, and more, and proposes an integrated model of information behaviour illuminating how these different concepts are interleaved and interconnected, along with its implications for information services.
Abstract:
Purpose: To examine the ability of silver nanoparticles to prevent the growth of Pseudomonas aeruginosa and Staphylococcus aureus in solution or when adsorbed into contact lenses, and to examine the ability of silver nanoparticles to prevent the growth of Acanthamoeba castellanii.
Methods: Etafilcon A lenses were soaked in various concentrations of silver nanoparticles. Bacterial cells were then exposed to these lenses, and the numbers of viable cells on the lens surface or in solution were compared with those for etafilcon A lenses not soaked in silver. Acanthamoeba trophozoites were exposed to silver nanoparticles and their ability to form tracks was examined.
Results: Lenses containing silver nanoparticles reduced bacterial viability and adhesion. There was a dose-dependent response, with 10 ppm or 20 ppm silver showing a > 5 log reduction in bacterial viability in solution or on the lens surface. For Acanthamoeba, 20 ppm silver reduced the ability to form tracks by approximately 1 log unit.
Conclusions: Silver nanoparticles are effective antimicrobial agents and, once incorporated into the lens, can reduce the ability of viable bacterial cells to colonise contact lenses.
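For readers unfamiliar with the units, the "log reduction" figures above are base-10. A sketch of the arithmetic with assumed plate counts (not the study's raw data):

```python
import numpy as np

n0 = 2.0e7    # viable cells/ml before exposure (assumed)
n = 150.0     # viable cells/ml after exposure (assumed)
log_reduction = np.log10(n0 / n)
print(f"{log_reduction:.1f} log reduction")   # ~5.1, i.e. >99.999% killed
```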
Abstract:
A number of instructors have recently adopted social network sites (SNSs) for learning. However, the learning design of SNSs often remains at a preliminary level, similar to a personal log book, because it does not properly incorporate reflective learning elements such as individual reflection and collaboration. This article looks at the reflective learning process and the public writing process as ways of improving the quality of reflective learning on SNSs. It proposes a reflective learning model for SNSs based on two key pedagogical concepts for social networking: individual expression and collaborative connection. The model is expected to help instructors design a reflective learning process on SNSs in an effective and flexible way.
Abstract:
Throughout this workshop session we have looked at various configurations of Sage, as well as using the Sage UI to run Sage applications (e.g. the image viewer). More advanced usage of Sage has been demonstrated using a Sage-compatible version of Paraview, highlighting the potential of parallel rendering. The aim of this tutorial session is to give a practical introduction to developing visual content for a tiled display using the Sage libraries. After completing this tutorial you should have the basic tools required to develop your own custom Sage applications. This tutorial is designed for software developers; intermediate programming knowledge is assumed, along with some introductory OpenGL. You will be required to write small portions of C/C++ code to complete this worksheet. However, if you do not feel comfortable writing code (or have never written in C or C++), we will be on hand throughout this session, so feel free to ask for some help. We have a number of machines in this lab running a VNC client to a virtual machine running Fedora 12. You should all be able to log in with the username “escience” and password “escience10”. Some of the commands in this worksheet require you to run them as the root user, so note the password as you may need to use it a few times. If you need to access the Internet, use the username “qpsf01”, password “escience10”.