250 results for Saliva collection devices and methods


Relevance: 100.00%

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key success factor for organizations and even governments. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now being applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitates decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause analysis in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables highly informative estimates of change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared to a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly support the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
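As a minimal illustration of the idea of Bayesian change point estimation in a Poisson process: the sketch below assumes, purely for illustration, that the pre- and post-change rates are known and that the change time has a uniform prior (the thesis itself uses hierarchical models and MCMC, not this closed-form simplification).

```python
import numpy as np

def poisson_step_change_posterior(counts, lam0, lam1):
    """Posterior over the change time tau (uniform prior), assuming
    counts[:tau] ~ Poisson(lam0) and counts[tau:] ~ Poisson(lam1).
    The log k! terms are dropped: they do not depend on tau."""
    n = len(counts)
    log_post = np.empty(n - 1)
    for tau in range(1, n):
        log_post[tau - 1] = (
            np.sum(counts[:tau] * np.log(lam0) - lam0)
            + np.sum(counts[tau:] * np.log(lam1) - lam1)
        )
    log_post -= log_post.max()        # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()          # normalised posterior P(tau | data)

rng = np.random.default_rng(0)
# simulated counts: rate 5 for 30 periods, then a step change to rate 9
data = np.concatenate([rng.poisson(5.0, 30), rng.poisson(9.0, 30)])
post = poisson_step_change_posterior(data, 5.0, 9.0)
tau_hat = int(np.argmax(post)) + 1    # MAP estimate of the change time
```

The full posterior, not just the point estimate, is what makes the approach informative: probability can be attached to any candidate change time.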

Relevance: 100.00%

Abstract:

Problems involving the solution of advection-diffusion-reaction equations on domains and subdomains whose growth affects and is affected by these equations, commonly arise in developmental biology. Here, a mathematical framework for these situations, together with methods for obtaining spatio-temporal solutions and steady states of models built from this framework, is presented. The framework and methods are applied to a recently published model of epidermal skin substitutes. Despite the use of Eulerian schemes, excellent agreement is obtained between the numerical spatio-temporal, numerical steady state, and analytical solutions of the model.
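As an illustrative sketch only, not the published model or scheme: a reaction-diffusion equation on a uniformly growing 1D domain can be rescaled to a fixed grid, which introduces a dilution term proportional to the growth rate. All parameter values below are arbitrary.

```python
import numpy as np

def rd_on_growing_domain(n=101, dt=1e-4, steps=20000, D=0.01, r=1.0, growth=0.05):
    """u_t = D u_xx + r u(1-u) on [0, L(t)] with exponential growth
    L(t) = exp(growth * t).  Rescaling to the fixed interval xi in [0,1] gives
    u_t = (D / L^2) u_xixi + r u(1-u) - growth * u   (dilution from growth)."""
    xi = np.linspace(0.0, 1.0, n)
    dxi = xi[1] - xi[0]
    u = np.exp(-100 * (xi - 0.5) ** 2)   # initial bump in the middle
    L = 1.0
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dxi**2
        lap[0], lap[-1] = lap[1], lap[-2]        # crude zero-flux boundaries
        u = u + dt * ((D / L**2) * lap + r * u * (1 - u) - growth * u)
        L *= np.exp(growth * dt)
    return u, L

u, L = rd_on_growing_domain()
```

The solution approaches the diluted carrying capacity 1 - growth/r rather than 1, showing how domain growth feeds back into the reaction dynamics.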

Relevance: 100.00%

Abstract:

This paper is based on an Australian Learning & Teaching Council (ALTC) funded evaluation, across 13 universities in Australia and New Zealand, of the use of Engineers Without Borders (EWB) projects in first-year engineering courses. All of the partner institutions have implemented this innovation differently, and comparison of these implementations affords us the opportunity to assemble "a body of carefully gathered data that provides evidence of which approaches work for which students in which learning environments". This study used a mixed-methods data collection approach and a realist analysis. Data were collected through program logic analysis with course co-ordinators, observation of classes, focus groups with students, an exit survey of students and interviews with staff, as well as scrutiny of relevant course and curriculum documents. Course designers and co-ordinators gave us a range of reasons for using the projects, most of which alluded to their presumed capacity to deliver experience in, and learning of, higher order thinking skills in areas such as sustainability, ethics, teamwork and communication. For some students, however, the nature of the projects decreased their interest in issues such as ethical development, sustainability and how to work in teams. We also found that the projects provoked different responses from students depending on the nature of the courses in which they were embedded (general introduction, design, communication, or problem-solving courses) and their mode of delivery (lecture, workshop or online).

Relevance: 100.00%

Abstract:

In recent years considerable attention has been paid to the numerical solution of stochastic ordinary differential equations (SODEs), as SODEs are often more appropriate than their deterministic counterparts in many modelling situations. However, unlike in the deterministic case, numerical methods for SODEs are considerably less sophisticated, due to the difficulty of representing the (possibly large number of) random variable approximations to the stochastic integrals. Although Burrage and Burrage [High strong order explicit Runge-Kutta methods for stochastic ordinary differential equations, Applied Numerical Mathematics 22 (1996) 81-101] were able to construct strong local order 1.5 stochastic Runge-Kutta methods for certain cases, it is known that all extant stochastic Runge-Kutta methods suffer an order reduction down to strong order 0.5 if there is non-commutativity between the functions associated with the multiple Wiener processes. This order reduction, down to that of the Euler-Maruyama method, imposes severe difficulties in obtaining meaningful solutions in a reasonable time frame, and this paper attempts to circumvent these difficulties with some new techniques. An additional difficulty in solving SODEs arises even in the linear case, since it is not possible to write the solution analytically in terms of matrix exponentials unless there is a commutativity property between the functions associated with the multiple Wiener processes. Thus in this paper, first the work of Magnus [On the exponential solution of differential equations for a linear operator, Communications on Pure and Applied Mathematics 7 (1954) 649-673] (applied to deterministic non-commutative linear problems) will be applied to non-commutative linear SODEs, and methods of strong order 1.5 for arbitrary, linear, non-commutative SODE systems will be constructed, hence giving an accurate approximation to the general linear problem.
Secondly, for general nonlinear non-commutative systems with an arbitrary number (d) of Wiener processes, it is shown that strong local order 1 Runge-Kutta methods with d + 1 stages can be constructed by evaluating a set of Lie brackets as well as the standard function evaluations. A method is then constructed which can be efficiently implemented in a parallel environment for this arbitrary number of Wiener processes. Finally some numerical results are presented which illustrate the efficacy of these approaches. (C) 1999 Elsevier Science B.V. All rights reserved.
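For context, the Euler-Maruyama scheme mentioned above, the strong order 0.5 baseline that the paper's higher-order methods improve upon, can be sketched as follows; the drift and diffusion functions are arbitrary illustrations.

```python
import numpy as np

def euler_maruyama(f, g, x0, T, n, rng):
    """Strong order 0.5 Euler-Maruyama scheme for dX = f(X) dt + g(X) dW,
    with a single Wiener process, on [0, T] with n steps."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    dW = rng.normal(0.0, np.sqrt(dt), size=n)   # Wiener increments
    for k in range(n):
        x[k + 1] = x[k] + f(x[k]) * dt + g(x[k]) * dW[k]
    return x

rng = np.random.default_rng(42)
# geometric Brownian motion: dX = 0.5 X dt + 0.2 X dW, X(0) = 1
path = euler_maruyama(lambda x: 0.5 * x, lambda x: 0.2 * x, 1.0, 1.0, 1000, rng)
```

With the diffusion term set to zero the scheme reduces to the deterministic Euler method, which is a convenient sanity check.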

Relevance: 100.00%

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets, up to several metres, without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often empirically determined. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational test method is to determine the criterion according to the underlying model and user requirements. Missed detection of incorrect integers leads to hazardous results and should be strictly controlled; in ambiguity resolution the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed based on extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined.
Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed based on extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
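A toy sketch of the ratio test idea, not the paper's fixed failure rate method: the float ambiguity solution is compared against the best and second-best integer candidates in the weighted metric, and the fix is accepted only if the second-best is sufficiently worse. The brute-force candidate search below stands in for the LAMBDA method used in practice, and all numbers are invented.

```python
import numpy as np
from itertools import product

def ratio_test(a_float, Q_inv, c=2.0):
    """Accept the integer fix only if the second-best integer candidate is
    at least c times worse (in the Q-weighted squared norm) than the best."""
    base = np.floor(a_float).astype(int)
    # brute-force search over neighbouring integer vectors (LAMBDA in practice)
    cands = [base + np.array(off) for off in product((0, 1), repeat=len(a_float))]
    d = sorted(float((a_float - z) @ Q_inv @ (a_float - z)) for z in cands)
    ratio = d[1] / d[0]
    return ratio >= c, ratio

Q_inv = np.diag([4.0, 4.0, 4.0])          # hypothetical weight matrix
ok, ratio = ratio_test(np.array([3.05, -1.98, 7.01]), Q_inv)
```

Here the float solution sits very close to one integer vector, so the ratio is large and the fix is accepted; the paper's contribution is choosing the threshold c from a target failure rate rather than fixing it empirically.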

Relevance: 100.00%

Abstract:

Collaboration between faculty and librarians is an important topic of discussion and research among academic librarians. These partnerships are vital for enabling students to become lifelong learners through their information literacy education. This research developed an understanding of academic collaborators by analyzing a community college faculty's teaching social networks. A teaching social network, an original term generated in this study, comprises the communications that influence faculty when they design and deliver their courses. The communication may be formal (e.g., through scholarly journals and professional development activities) or informal (e.g., through personal communication), flowing through the elements of their network. Examples of the elements of a teaching social network are department faculty, administration, librarians, professional development, and students. This research asked: what is the nature of faculty's teaching social networks, and what are the implications for librarians? This study advances existing research on collaboration, information literacy, and social network analysis, and provides both faculty and librarians with added insight into their existing and potential relationships. This research was undertaken using mixed methods: social network analysis was the quantitative data collection methodology, and interviews were the qualitative technique. For the social network analysis data, a survey was sent to full-time faculty at Las Positas College, a community college in California. The survey gathered data describing faculty's teaching social networks with respect to their teaching methods and the content they taught.
Semi-structured interviews were conducted following the survey with a sub-set of survey respondents to understand why specific elements were included in their teaching social networks and to learn of ways for librarians to become an integral part of those networks. The majority of the faculty respondents were moderately influenced by the elements of their network, although with respect to content taught most elements exerted only a weak influence. The elements with the most influence on both teaching methods and content taught were students, department faculty, professional development, and former graduate professors and coursework. The elements with the least influence on both aspects were public or academic librarians, and social media. The most common roles for the elements were conversations about teaching, sharing ideas, tips for teaching, insights into teaching, suggestions for ways of teaching, and how to engage students. Librarians weakly influenced faculty in both their teaching methods and their content taught. The motivating factors for collaboration with librarians were that students learned how to research, students' research projects improved, faculty saved time by having librarians provide the instruction to students, and faculty built strong working relationships with librarians. The challenges of collaborating with librarians were inadequate teaching techniques used when librarians taught research orientations, and lack of time. Ways librarians can become more integral in faculty's teaching social networks included: more workshops for faculty, more proactive interaction with faculty, and more one-on-one training sessions for faculty.
Recommendations for librarians from this study included: developing a strong rapport with faculty; building information literacy services from the point of view of the faculty rather than the librarian's perspective; using staff development funding to attend conferences and workshops to improve their teaching; developing more training sessions for faculty; increasing marketing of librarians' instructional services; and seeking grant opportunities to increase funding for the library. In addition, librarians and faculty should revisit the definitions of information literacy and move from a skills-based interpretation to viewing it as a learning process.

Relevance: 100.00%

Abstract:

Food prices and food affordability are important determinants of food choices, obesity and non-communicable diseases. As governments around the world consider policies to promote the consumption of healthier foods, data on the relative price and affordability of foods, with a particular focus on the difference between ‘less healthy’ and ‘healthy’ foods and diets, are urgently needed. This paper briefly reviews past and current approaches to monitoring food prices, and identifies key issues affecting the development of practical tools and methods for food price data collection, analysis and reporting. A step-wise monitoring framework, including measurement indicators, is proposed. ‘Minimal’ data collection will assess the differential price of ‘healthy’ and ‘less healthy’ foods; ‘expanded’ monitoring will assess the differential price of ‘healthy’ and ‘less healthy’ diets; and the ‘optimal’ approach will also monitor food affordability, by taking into account household income. The monitoring of the price and affordability of ‘healthy’ and ‘less healthy’ foods and diets globally will provide robust data and benchmarks to inform economic and fiscal policy responses. Given the range of methodological, cultural and logistical challenges in this area, it is imperative that all aspects of the proposed monitoring framework are tested rigorously before implementation.

Relevance: 100.00%

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information giving an advantage to those involved in prediction markets. Often such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods to extract smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that through the papers presented, and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
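The content-analysis scoring described in the first paper might be sketched, in heavily simplified form, as follows; all keywords, user names and weights below are invented for illustration.

```python
def score_tweet(tweet, keywords, trusted_users, urgency_terms):
    """Toy content-analysis score: topic keywords plus urgency terms,
    boosted when the author is a known authoritative source."""
    text = tweet["text"].lower()
    score = sum(2 for kw in keywords if kw in text)          # topic relevance
    score += sum(3 for term in urgency_terms if term in text)  # urgency
    if tweet["user"] in trusted_users:                       # user profiling
        score *= 2
    return score

# hypothetical incoming stream during a flood event
tweets = [
    {"user": "qld_ses", "text": "Flood warning: evacuate low-lying areas now"},
    {"user": "random1", "text": "nice weather today"},
]
ranked = sorted(
    tweets,
    key=lambda t: score_tweet(t, ["flood", "evacuate"], {"qld_ses"}, ["warning", "now"]),
    reverse=True,
)
```

Ranking by such a score is one way to cut a large stream down to the expected capacity of human responders, as the panel description suggests.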

Relevance: 100.00%

Abstract:

This paper examines the capacity of digital storytelling to document research activity in the creative and performing arts. In particular, it seeks to identify the thought processes and methods that underpin this research and to capture them using the digital storytelling medium. Interest in this issue was prompted by the author's work with creative and performing artists from the Queensland Conservatorium and the Queensland College of Art as part of the Federal government's Research Quality Framework (RQF) in 2007. The RQF compelled artists to address what it means to undertake research in their disciplines, to describe this, measure it and quantify it; for many practitioners this represents a significant challenge. These issues continue to be pertinent in the context of the Excellence in Research for Australia (ERA) initiative. This research is significant because it seeks to identify, in layman's terms, the research methods and thought processes used by artists in their research practice, free of the encumbrances of professional doctorate policies, higher education research quality frameworks, and the dense philosophical debates that have to date dominated discussions of this issue. The research involves qualitative data collection methods including a detailed literature review, interviews with key practitioners and academics involved in the creative and performing arts, and three case studies. The literature review focuses on publications that explore issues of research practice and method in the creative and performing arts. The case studies involve three Queensland-based artists. Digital stories will be developed (and presented) with Marcus and Mafe using their visual materials and drawing on the issues identified in the literature review and interviews; Emmerson's DVD provided a point of comparison with the digital stories. (Brief bios are attached)

Relevance: 100.00%

Abstract:

Background: Cardiovascular disease is the leading cause of death in the world. Human C-reactive protein (CRP) has been used in the risk assessment of coronary events. Human saliva mirrors the body's health and well-being; it is non-invasive, easy to collect and ideal for third world countries as well as for large patient screening. The aim was to establish a saliva CRP reference range and to demonstrate the clinical utility of salivary CRP levels in assessing coronary events in a primary health care setting. Methods: We used a homogeneous bead-based assay to detect CRP levels in human saliva, developing a rapid 15 min (vs 90 min), sequential, one-step assay. Saliva was collected from healthy volunteers (n = 55, ages 20-70 years) as well as from cardiac patients (n = 28, ages 43-86 years). Results: The assay incubation time was optimised from 90 min to 15 min and generated a positive correlation (n = 29, range 10-2189 pg/mL, r2 = 0.94; Passing-Bablok slope 0.885, intercept 0, p > 0.10), meaning we could decrease the incubation time and produce equivalent results with confidence. The mean CRP level in saliva was 285 pg/mL in healthy volunteers and 1680 pg/mL in cardiac patients (p < 0.01). Analysis of CRP concentrations in paired serum and saliva samples from cardiac patients gave a positive correlation (r2 = 0.84, p < 0.001), and the salivary CRP concentration was capable of distinguishing healthy from diseased patients. Conclusions: The results suggest that this minimally invasive, rapid and sensitive assay will be useful in large patient screening studies for risk assessment of coronary events. (C) 2011 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

Head and neck cancers (HNCs) represent a significant and ever-growing burden to modern society, mainly due to the lack of early diagnostic methods. A significant number of HNCs are associated with drinking, smoking, chewing betel nut, and human papilloma virus (HPV) infections. We analyzed DNA methylation patterns in tumor and normal tissue samples collected from head and neck squamous cell carcinoma (HNSCC) patients who were smokers. We identified novel methylation sites in the promoter of the mediator complex subunit 15 (MED15/PCQAP) gene (encoding a co-factor important for regulation of transcription initiation at the promoters of many genes), hypermethylated specifically in tumor cells. Two clusters of CpG dinucleotides methylated in tumors, but not in normal tissue from the same patients, were identified. These CpG methylation events were further validated in saliva samples from a separate cohort of HNSCC patients (who developed cancer due to smoking or HPV infections) and healthy controls, using methylation-specific PCR (MSP). We used saliva as the biological medium because of its non-invasive nature, close proximity to the tumors, ease of collection, and because it is an economically viable option for large-scale screening studies. The methylation levels for the two identified CpG clusters were significantly different between the saliva samples collected from healthy controls and HNSCC individuals (Welch's t-test P < 0.05 and Mann-Whitney test P < 0.01 for both). The developed MSP assays also provided good discriminative ability, with AUC values of 0.70 (P < 0.01) and 0.63 (P < 0.05). The identified novel CpG methylation sites may serve as potential non-invasive biomarkers for detecting HNSCC. © the authors.
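The discriminative ability reported above is quantified by the AUC, which has a simple empirical form: the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney statistic rescaled to [0, 1]). The methylation values below are hypothetical.

```python
def auc_from_scores(pos, neg):
    """Empirical AUC = P(random positive scores above random negative),
    counting ties as half a win."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical methylation levels for HNSCC cases vs healthy controls
cases = [0.62, 0.71, 0.55, 0.48, 0.80]
controls = [0.30, 0.41, 0.52, 0.25, 0.38]
auc = auc_from_scores(cases, controls)
```

An AUC of 0.5 means the marker is no better than chance, while values approaching 1.0 indicate a clean separation between cases and controls.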

Relevance: 100.00%

Abstract:

Integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This tendency, together with the falling price of DG units, gives DG great potential to participate in the voltage regulation process, in parallel with other regulating devices already available in distribution systems. The voltage control issue is a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, a control coordination approach is proposed which is able to utilize the ability of DG as a voltage regulator while minimizing the interaction of DG with other DG units or other active devices, such as the On-load Tap Changing Transformer (OLTC). The proposed technique has been developed based on protection principles (magnitude grading and time grading) for response coordination of DG and other regulating devices, and uses Advanced Line Drop Compensators (ALDCs) for implementation. A distribution feeder with a tap changing transformer and DG units has been extracted from a practical system to test the proposed control technique. The results show that the proposed method provides an effective solution for coordination of DG with other DG units or voltage regulating devices, and that the integration of protection principles considerably reduces the control interaction needed to achieve the desired voltage correction.

Relevance: 100.00%

Abstract:

Biofilms are complex communities of microbial cells embedded in an exopolysaccharide matrix that adhere to the surface of medical devices. Biofilm-associated infections of medical devices pose a serious public health problem and adversely affect the function of the device. Medical implants used in oral and orthopedic surgery are fabricated from alloys such as stainless steel and titanium. Their biological behavior, such as osseointegration and antibacterial activity, depends essentially on both the chemical composition and the morphology of the device surface. Surface treatment of medical implants by various physical and chemical techniques has been attempted in order to improve their surface properties, so as to facilitate bio-integration and prevent bacterial adhesion. Bacteria adherent to, or in a biofilm on, the implant are the potential source of infection of the surrounding tissue, so antimicrobial strategies should prevent both biofilm formation and tissue colonization. This article provides an overview of bacterial biofilm formation and methods adopted for the inhibition of bacterial adhesion on medical implants.

Relevance: 100.00%

Abstract:

"First published in 1988, Ecological and Behavioral Methods for the Study of Bats is widely acknowledged as the primary reference for both amateur and professional bat researchers. Bats are the second most diverse group of mammals on the earth. They live on every continent except Antarctica, ranging from deserts to tropical forests to mountains, and their activities have a profound effect on the ecosystems in which they live. Despite their ubiquity and importance, bats are challenging to study. This volume provides researchers, conservationists, and consultants with the ecological background and specific information essential for studying bats in the wild and in captivity. Chapters detail many of the newest and most commonly used field and laboratory techniques needed to advance the study of bats, describe how these methods are applied to the study of the ecology and behavior of bats, and offer advice on how to interpret the results of research. The book includes forty-three chapters, fourteen of which are new to the second edition, with information on molecular ecology and evolution, bioacoustics, chemical communication, flight dynamics, population models, and methods for assessing postnatal growth and development. Fully illustrated and featuring contributions from the world’s leading experts in bat biology, this reference contains everything bat researchers and natural resource managers need to know for the study and conservation of this wide-ranging, ecologically vital, and diverse taxon."--Publisher website

Relevance: 100.00%

Abstract:

Current mobile devices and streaming video services support high definition (HD) video, increasing the expectation for more content. HD video streaming generally requires large bandwidth, exerting pressure on existing networks. A new generation of video compression codecs, such as VP9 and H.265/HEVC, is expected to be more effective at reducing bandwidth. Existing studies measuring the impact of compression on users' perceived quality have not focused on mobile devices. Here we propose new Quality of Experience (QoE) models that consider both subjective and objective assessments of mobile video quality. We introduce novel predictors, such as the correlation between video resolution and coding unit size, and achieve a high goodness-of-fit to the collected subjective assessment data (adjusted R-square > 83%). The performance analysis shows that H.265 can potentially achieve 44% to 59% bit rate savings compared to H.264/AVC, slightly better than VP9 at 33% to 53%, depending on video content and resolution.
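The bit rate savings quoted above follow a simple definition: the percentage reduction in bit rate needed by the test codec to reach the same perceived quality as the reference. The rates below are invented for illustration.

```python
def bitrate_saving(ref_kbps, test_kbps):
    """Percentage bit rate saving of a test codec versus a reference codec
    at matched (subjectively assessed) quality."""
    return 100.0 * (ref_kbps - test_kbps) / ref_kbps

# hypothetical rates for the same clip at matched perceived quality
saving_h265 = bitrate_saving(4000, 1800)   # H.265 vs H.264/AVC
saving_vp9 = bitrate_saving(4000, 2100)    # VP9 vs H.264/AVC
```

In practice such comparisons are made per content and per resolution, which is why the paper reports ranges rather than single figures.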