981 results for Extended techniques


Relevance: 20.00%

Abstract:

Evidence within Australia and internationally suggests parenthood as a risk factor for inactivity; however, research into understanding parental physical activity is scarce. Given that active parents can create active families and social factors are important for parents’ decision making, the authors investigated a range of social influences on parents’ intentions to be physically active. Parents (N = 580; 288 mothers and 292 fathers) of children younger than 5 years completed an extended Theory of Planned Behavior questionnaire either online or paper based. For both genders, attitude, control factors, group norms, friend general support, and an active parent identity predicted intentions, with social pressure and family support further predicting mothers’ intentions and active others further predicting fathers’ intentions. Attention to these factors and those specific to the genders may improve parents’ intentions to be physically active, thus maximizing the benefits to their own health and the healthy lifestyle practices for other family members.
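
The abstract does not report the exact model specification; the sketch below is only an illustration of how intention could be regressed on the extended TPB predictors for one gender subgroup, assuming a survey DataFrame with hypothetical column names.

```python
# Hedged sketch: an OLS regression of physical-activity intention on extended
# TPB predictors, split by parent gender. Column names are hypothetical; the
# original study's measures and modelling choices are not reproduced here.
import pandas as pd
import statsmodels.formula.api as smf

def fit_intention_model(df: pd.DataFrame, gender: str):
    """Fit intention ~ extended-TPB predictors for one gender subgroup."""
    subset = df[df["gender"] == gender]
    model = smf.ols(
        "intention ~ attitude + control + group_norm + friend_support "
        "+ parent_identity + social_pressure + family_support + active_others",
        data=subset,
    )
    return model.fit()

# Example usage (assuming survey responses loaded into `parents`):
# mothers_fit = fit_intention_model(parents, "mother")
# print(mothers_fit.summary())
```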

Relevance: 20.00%

Abstract:

BACKGROUND: Donor retention is vital to blood collection agencies. Past research has highlighted the importance of early career behavior for long-term donor retention, yet research investigating the determinants of early donor behavior is scarce. Using an extended Theory of Planned Behavior (TPB), this study sought to identify the predictors of first-time blood donors' early career retention. STUDY DESIGN AND METHODS: First-time donors (n = 256) completed three surveys on blood donation. The standard TPB predictors and self-identity as a donor were assessed 3 weeks (Time 1) and at 4 months (Time 2) after an initial donation. Path analyses examined the utility of the extended TPB to predict redonation at 4 and 8 months after initial donation. RESULTS: The extended TPB provided a good fit to the data. Post-Time 1 and 2 behavior was consistently predicted by intention to redonate. Further, intention was predicted by attitudes, perceived control, and self-identity (Times 1 and 2). Donors' intentions to redonate at Time 1 were the strongest predictor of intention to donate at Time 2, while donors' behavior at Time 1 strengthened self-identity as a blood donor at Time 2. CONCLUSION: An extended TPB framework proved efficacious in revealing the determinants of first-time donor retention in an initial 8-month period. The results suggest that collection agencies should intervene to bolster donors' attitudes, perceived control, and identity as a donor during this crucial post–first donation period.

Relevance: 20.00%

Abstract:

Genomic DNA obtained from patient whole blood samples is a key element for genomic research. Advantages and disadvantages, in terms of time-efficiency, cost-effectiveness and laboratory requirements, of procedures available to isolate nucleic acids need to be considered before choosing any particular method. These characteristics have not been fully evaluated for some laboratory techniques, such as the salting out method for DNA extraction, which has been excluded from comparison in different studies published to date. We compared three different protocols (a traditional salting out method, a modified salting out method and a commercially available kit method) to determine the most cost-effective and time-efficient method to extract DNA. We extracted genomic DNA from whole blood samples obtained from breast cancer patient volunteers and compared the results in terms of quantity (concentration of DNA extracted and DNA obtained per ml of blood used) and quality (260/280 ratio and polymerase chain reaction product amplification) of the yield obtained. On average, the three methods showed no statistically significant differences in the final yield, but when we accounted for the time and cost of each method, the differences were highly significant. The modified salting out method resulted in a seven- and twofold reduction in cost compared to the commercial kit and the traditional salting out method, respectively, and reduced the time from 3 days to 1 hour compared to the traditional salting out method. This highlights the modified salting out method as a suitable choice for laboratories and research centres, particularly when dealing with a large number of samples.
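
The abstract does not name the statistical test used; purely as an illustration, a comparison of DNA yield across the three protocols could be run as a one-way ANOVA, as sketched below with made-up numbers.

```python
# Hedged sketch: comparing DNA yield (e.g. ng of DNA per ml of blood) across
# the three extraction protocols with a one-way ANOVA. The study's actual
# test, data, and variable names are not specified in the abstract.
from scipy import stats

def compare_yields(traditional, modified, kit):
    """Return the one-way ANOVA F statistic and p-value for three yield samples."""
    return stats.f_oneway(traditional, modified, kit)

# Example usage with hypothetical yields:
# f_stat, p_value = compare_yields([310, 295, 330], [305, 320, 315], [300, 310, 298])
# print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```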

Relevance: 20.00%

Abstract:

Results of an interlaboratory comparison on size characterization of SiO2 airborne nanoparticles using on-line and off-line measurement techniques are discussed. This study was performed in the framework of Technical Working Area (TWA) 34—“Properties of Nanoparticle Populations” of the Versailles Project on Advanced Materials and Standards (VAMAS) in project no. 3, “Techniques for characterizing size distribution of airborne nanoparticles”. Two types of nano-aerosols, consisting of (1) one population of nanoparticles with a mean diameter between 30.3 and 39.0 nm and (2) two populations of non-agglomerated nanoparticles with mean diameters between, respectively, 36.2–46.6 nm and 80.2–89.8 nm, were generated for characterization measurements. Scanning mobility particle size spectrometers (SMPS) were used for on-line measurements of the size distributions of the produced nano-aerosols. Transmission electron microscopy, scanning electron microscopy, and atomic force microscopy were used as off-line measurement techniques for nanoparticle characterization. Samples were deposited on appropriate supports such as grids, filters, and mica plates by electrostatic precipitation and a filtration technique using SMPS-controlled generation upstream. The results for the main size distribution parameters (mean and mode diameters), obtained from several laboratories, were compared based on metrological approaches including metrological traceability, calibration, and evaluation of the measurement uncertainty. Internationally harmonized measurement procedures for airborne SiO2 nanoparticle characterization are proposed.
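
As a minimal illustration of the two headline parameters compared above, the sketch below computes mean and mode diameters from binned SMPS-style data; it ignores instrument-specific corrections and the uncertainty budgets the study evaluates, and all numbers are hypothetical.

```python
# Hedged sketch: mean and mode diameters from binned size-distribution data
# (bin midpoint diameters in nm and particle counts per bin). This is not the
# harmonized procedure proposed in the study.
import numpy as np

def mean_and_mode_diameter(diameters_nm: np.ndarray, counts: np.ndarray):
    """Return the count-weighted mean diameter and the mode diameter."""
    mean_d = np.average(diameters_nm, weights=counts)
    mode_d = diameters_nm[np.argmax(counts)]
    return mean_d, mode_d

# Example with a hypothetical mono-modal distribution around ~35 nm:
# d = np.array([20, 25, 30, 35, 40, 45, 50], dtype=float)
# n = np.array([5, 40, 180, 260, 170, 60, 10], dtype=float)
# print(mean_and_mode_diameter(d, n))
```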

Relevance: 20.00%

Abstract:

Acoustic sensors can be used to estimate species richness for vocal species such as birds. They can continuously and passively record large volumes of data over extended periods. These data must subsequently be analyzed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced surveyors can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. This study examined the use of sampling methods to reduce the cost of analyzing large volumes of acoustic sensor data, while retaining high levels of species detection accuracy. Utilizing five days of manually analyzed acoustic sensor data from four sites, we examined a range of sampling frequencies and methods, including random, stratified, and biologically informed. We found that randomly selecting 120 one-minute samples from the three hours immediately following dawn over five days of recordings detected the highest number of species. On average, this method detected 62% of total species from 120 one-minute samples, compared to 34% of total species detected from traditional area search methods. Our results demonstrate that targeted sampling methods can provide an effective means of analyzing large volumes of acoustic sensor data efficiently and accurately. Development of automated and semi-automated techniques is required to assist in analyzing large volumes of acoustic sensor data. Read More: http://www.esajournals.org/doi/abs/10.1890/12-2088.1
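
The best-performing sampling scheme described above can be sketched as a simple random draw of 120 one-minute samples from the three hours after dawn, pooled over five days. The dawn time below is a placeholder; the study's exact implementation is not reproduced.

```python
# Hedged sketch of the dawn sampling scheme: randomly select 120 one-minute
# samples from the three hours immediately following dawn, pooled over five
# days of recordings. Real deployments would use per-day sunrise times.
import random

def dawn_samples(n_samples=120, n_days=5, dawn_minute=300, window_minutes=180, seed=42):
    """Return (day, minute-of-day) pairs drawn without replacement."""
    pool = [(day, dawn_minute + m) for day in range(n_days) for m in range(window_minutes)]
    rng = random.Random(seed)
    return rng.sample(pool, n_samples)

# Example usage:
# for day, minute in dawn_samples()[:5]:
#     print(f"day {day + 1}, minute {minute} ({minute // 60:02d}:{minute % 60:02d})")
```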

Relevance: 20.00%

Abstract:

A significant amount of speech is typically required for speaker verification system development and evaluation, especially in the presence of large intersession variability. This paper introduces source and utterance duration normalized linear discriminant analysis (SUN-LDA) approaches to compensate for session variability in short-utterance i-vector speaker verification systems. Two variations of SUN-LDA are proposed in which normalization techniques are used to capture source variation from both short and full-length development i-vectors, one based upon pooling (SUN-LDA-pooled) and the other on concatenation (SUN-LDA-concat) across the duration and source-dependent session variation. Both the SUN-LDA-pooled and SUN-LDA-concat techniques are shown to provide improvement over traditional LDA on the NIST 08 truncated 10sec-10sec evaluation conditions, with the highest improvement obtained with the SUN-LDA-concat technique, achieving a relative improvement of 8% in EER for mismatched conditions and over 3% for matched conditions over traditional LDA approaches.
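
The toy sketch below is not the authors' SUN-LDA algorithm; it only illustrates the general idea of the pooled variant, i.e. fitting a session-compensation LDA projection on development i-vectors pooled across short and full-length utterances, with speaker identities as classes. The paper's source- and duration-dependent scatter handling is omitted.

```python
# Hedged illustration (not the paper's SUN-LDA): train an LDA projection on
# i-vectors pooled over both duration conditions, using speaker labels as classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_pooled_lda(ivec_full, spk_full, ivec_short, spk_short, n_dims=200):
    """Fit an LDA projection on i-vectors pooled over full and short utterances."""
    X = np.vstack([ivec_full, ivec_short])       # (n_full + n_short, ivec_dim)
    y = np.concatenate([spk_full, spk_short])    # speaker labels as classes
    lda = LinearDiscriminantAnalysis(n_components=min(n_dims, len(set(y)) - 1))
    return lda.fit(X, y)

# Example usage (shapes only; real i-vectors come from a front-end extractor):
# lda = train_pooled_lda(full_ivecs, full_speakers, short_ivecs, short_speakers)
# projected = lda.transform(test_ivecs)
```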

Relevance: 20.00%

Abstract:

A people-to-people matching system (or a match-making system) refers to a system in which users join with the objective of meeting other users with a common need. Some real-world examples of these systems are employer-employee (in job search networks), mentor-student (in university social networks), consumer-to-consumer (in marketplaces) and male-female (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured to develop an efficient match-making system. Most existing studies utilize information either about each of the users in isolation or about their interactions separately, and develop recommender systems using only one form of information. It is imperative to understand the linkages among the users in the network and use them in developing a match-making system. This study utilizes several social network analysis methods, such as graph theory, the small-world phenomenon, centrality analysis, and density analysis, to gain insight into the entities and their relationships present in this network. This paper also proposes a new type of graph called an “attributed bipartite graph”. Using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed which generates recommendations for new users as well as showing improvement in accuracy over the baseline methods.
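
The paper's "attributed bipartite graph" is not formally defined in this abstract; the sketch below only shows one plausible way to represent two user groups with node attributes and interaction edges using networkx, with hypothetical attribute names.

```python
# Hedged sketch: one plausible representation of an attributed bipartite graph
# for a match-making network. Two user groups form the two node sets, profile
# attributes sit on nodes, and interactions are weighted edges. This is an
# illustration, not the paper's formal definition.
import networkx as nx

def build_attributed_bipartite_graph(group_a, group_b, interactions):
    """group_a/group_b: dicts of user_id -> attribute dict; interactions: (a, b, weight)."""
    g = nx.Graph()
    g.add_nodes_from((uid, {"bipartite": 0, **attrs}) for uid, attrs in group_a.items())
    g.add_nodes_from((uid, {"bipartite": 1, **attrs}) for uid, attrs in group_b.items())
    g.add_weighted_edges_from(interactions)  # e.g. number of messages exchanged
    return g

# Example usage:
# employers = {"e1": {"industry": "IT"}}
# candidates = {"c1": {"skill": "Java"}, "c2": {"skill": "SQL"}}
# g = build_attributed_bipartite_graph(employers, candidates, [("e1", "c1", 3)])
# print(list(g.neighbors("e1")), g.nodes["c1"])
```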

Relevance: 20.00%

Abstract:

Gold is often considered an inert material, but it has been unequivocally demonstrated that it possesses unique electronic, optical, catalytic and electrocatalytic properties when in a nanostructured form.[1] For the latter, the electrochemical behaviour of gold in aqueous media has been widely studied on a plethora of gold samples, including bulk polycrystalline and single-crystal electrodes, nanoparticles, evaporated films as well as electrodeposited nanostructures, particles and thin films.[1b, 2] It is now well established that the electrochemical behaviour of gold is not as simple as an extended double-layer charging region followed by a monolayer oxide-formation/-removal process. In fact, the so-called double-layer region of gold is significantly more complicated and has been investigated with a variety of electrochemical and surface science techniques. Burke and others[3] have demonstrated that significant processes due to the oxidation of low lattice-stabilised atoms or clusters of atoms occur in this region at thermally and electrochemically treated electrodes, which were later confirmed by Bond[4] to be Faradaic in nature via large-amplitude Fourier transformed ac voltammetric experiments. Supporting evidence for the oxidation of gold in the double-layer region was provided by Bard,[5] who used a surface interrogation mode of scanning electrochemical microscopy to quantify the extent of this process that forms incipient oxides on the surface. These were estimated to be as high as 20% of a monolayer. This correlated with contact electrode resistance measurements,[6] capacitance measurements[7] and also electroreflection techniques...

Relevance: 20.00%

Abstract:

The continuous growth of XML data poses a great concern in the area of XML data management. The need to process large amounts of XML data brings complications to many applications, such as information retrieval, data integration and many others. One way of simplifying this problem is to break the massive amount of data into smaller groups by applying clustering techniques. However, XML clustering is an intricate task that may involve processing both the structure and the content of XML data in order to identify similar XML data. This research presents four clustering methods, two utilizing the structure of XML documents and the other two utilizing both the structure and the content. The two structural clustering methods have different data models: one is based on a path model and the other on a tree model. These methods employ rigid similarity measures which aim to identify corresponding elements between documents with different or similar underlying structure. The two clustering methods that utilize both the structural and content information vary in terms of how the structure and content similarity are combined. One clustering method calculates the document similarity by using a linear weighting combination strategy of structure and content similarities. The content similarity in this clustering method is based on a semantic kernel. The other method calculates the distance between documents by a non-linear combination of the structure and content of XML documents using a semantic kernel. Empirical analysis shows that the structure-only clustering method based on the tree model is more scalable than the structure-only clustering method based on the path model, as the tree similarity measure for the tree model does not need to visit the parents of an element many times. Experimental results also show that the clustering methods perform better with the inclusion of the content information on most test document collections. To further the research, the structural clustering method based on the tree model is extended and employed in XML transformation. The results from the experiments show that the proposed transformation process is faster than the traditional transformation system that translates and converts the source XML documents sequentially. Also, the schema matching process of XML transformation produces a better matching result in a shorter time.
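
The linear weighting combination of structure and content similarities described above can be sketched as follows; the structure and content similarity functions are placeholders, and the thesis's path/tree measures, semantic kernel, and non-linear combination are not reproduced.

```python
# Hedged sketch of a linear weighting combination of structure and content
# similarity between two XML documents. The similarity functions are
# placeholders supplied by the caller.
def combined_similarity(doc_a, doc_b, structure_sim, content_sim, alpha=0.5):
    """alpha * structural similarity + (1 - alpha) * content similarity."""
    return alpha * structure_sim(doc_a, doc_b) + (1.0 - alpha) * content_sim(doc_a, doc_b)

# Example usage with a trivial placeholder measure over token sets:
# def jaccard(a, b):
#     sa, sb = set(a), set(b)
#     return len(sa & sb) / len(sa | sb) if sa | sb else 0.0
# sim = combined_similarity(doc_a_tags, doc_b_tags, jaccard, jaccard, alpha=0.6)
```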

Relevance: 20.00%

Abstract:

Using cooperative learning in classrooms promotes academic achievement, communication skills, problem-solving, social skills and student motivation. Yet it is reported that cooperative learning, as a Western educational concept, may be ineffective in Asian cultural contexts. The study aims to investigate the utilisation of scaffolding techniques for cooperative learning in Thai primary mathematics classes. A teacher training program was designed to foster Thai primary school teachers’ implementation of cooperative learning. Two teachers participated in this experimental program for one and a half weeks and then implemented cooperative learning strategies in their mathematics classes for six weeks. The data collected from teacher interviews and classroom observations indicate that the difficulty or failure of implementing cooperative learning in Thai education may not derive directly from cultural differences. Instead, the data indicate that Thai culture can be constructively merged with cooperative learning through a teacher training program and the practice of scaffolding techniques.

Relevance: 20.00%

Abstract:

An extended theory of planned behavior (TPB) was used to understand the factors informing younger people's intentions to join a bone marrow registry, particularly control perceptions and affective reactions, given conflicting findings in previous research. Participants (N = 174) completed attitude, subjective norm, perceived behavioral control (PBC), moral norm, anticipated regret, self-identity, and intention items for registering. The extended TPB (except PBC) explained 67.2% of the variance in intention. Further testing is needed as to the volitional nature of registering. Moral norm, anticipated regret, and self-identity are likely intervention targets for increasing younger people's bone marrow registry participation.

Relevance: 20.00%

Abstract:

Due to the critical shortage of and continued need for blood and organ donations (ODs), research exploring similarities and differences in the motivational determinants of these behaviors is needed. In a sample of 258 university students, we used a cross-sectional design to test the utility of an extended theory of planned behavior (TPB), including moral norm, self-identity and in-group altruism (family/close friends and ethnic group), to predict people’s blood and OD intentions. Overall, the extended TPB explained 77.0% and 74.6% of the variance in blood and OD intentions, respectively. In regression analyses, common contributors to intentions across donation contexts were attitude, self-efficacy and self-identity. Normative influences varied, with subjective norm a significant predictor of OD intentions but not blood donation intentions at the final step of the regression analyses. Moral norm did not contribute significantly to blood or OD intentions. In-group altruism (family/close friends) was significantly related to OD intentions only in the regressions. Future donation strategies should increase confidence to donate, foster a perception of the self as the type of person who donates blood and/or organs, and address preferences to donate organs to in-group members only.

Relevance: 20.00%

Abstract:

Type 2 diabetes remains an escalating world-wide problem, despite a range of treatments. The revelation that insulin secretion is under the control of a gut hormone, glucagon-like peptide 1 (GLP-1), led to a new paradigm in the management of type 2 diabetes: medicines that directly stimulate GLP-1 receptors, or that prolong the actions of endogenous GLP-1 at its receptors. Exenatide is an agonist at GLP-1 receptors, and was initially developed as a subcutaneous twice-daily medication, ExBID. The clinical trials with ExBID established a role for exenatide in the treatment of type 2 diabetes. Subsequently, once-weekly exenatide (ExQW) was shown to have advantages over ExBID, and there is now more emphasis on the development of ExQW. ExQW alone reduces glycosylated haemoglobin (HbA1c) and body weight, and is well tolerated. ExQW has been compared to sitagliptin, pioglitazone and metformin, and shown to have a greater ability to reduce HbA1c than these other medicines. The only insulin preparation to which ExQW has been compared is insulin glargine, and ExQW has some favourable properties in this comparison, notably causing weight loss compared to the weight gain seen with insulin glargine. ExQW has been compared to another GLP-1 receptor agonist, liraglutide, and ExQW is non-inferior to liraglutide in reducing HbA1c. The small amount of evidence available shows that subjects with type 2 diabetes prefer ExQW to ExBID, and that adherence to both was high in the clinical trial setting. Healthcare and economic modelling suggests that, with long-term use, ExQW will reduce diabetic complications and be cost-effective compared to other medications. Little is known about whether subjects with type 2 diabetes prefer ExQW to other medicines, and whether adherence to ExQW is good in practice; these important topics require further study.

Relevance: 20.00%

Abstract:

Airport efficiency is important because it has a direct impact on customer safety and satisfaction and therefore the financial performance and sustainability of airports, airlines, and affiliated service providers. This is especially so in a world characterized by an increasing volume of both domestic and international air travel, price and other forms of competition between rival airports, airport hubs and airlines, and rapid and sometimes unexpected changes in airline routes and carriers. It also reflects expansion in the number of airports handling regional, national, and international traffic and the growth of complementary airport facilities including industrial, commercial, and retail premises. This has fostered a steadily increasing volume of research aimed at modeling and providing best-practice measures and estimates of airport efficiency using mathematical and econometric frontiers. The purpose of this chapter is to review these various methods as they apply to airports throughout the world. Apart from discussing the strengths and weaknesses of the different approaches and their key findings, the chapter also examines the steps faced by researchers as they move through the modeling process in defining airport inputs and outputs and the purported efficiency drivers. Accordingly, the chapter provides guidance to those conducting empirical research on airport efficiency and serves as an aid for aviation regulators, airport operators and others in interpreting airport efficiency research outcomes.
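
As one concrete illustration of the frontier methods the chapter surveys, the input-oriented CCR data envelopment analysis (DEA) efficiency of a single airport can be computed as a linear program. The sketch below is a generic textbook formulation with hypothetical inputs and outputs, not the chapter's own model.

```python
# Hedged sketch: input-oriented CCR DEA for one decision-making unit (airport),
# solved as a linear program with scipy. Inputs/outputs are illustrative
# (e.g. staff and runways vs. passengers handled).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input_oriented(X, Y, k):
    """Efficiency of DMU k. X: (n_inputs, n_dmus), Y: (n_outputs, n_dmus)."""
    n_in, n_dmus = X.shape
    n_out = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n_dmus)
    c[0] = 1.0                                     # minimise theta
    A_in = np.hstack([-X[:, [k]], X])              # sum_j lam_j * x_ij <= theta * x_ik
    A_out = np.hstack([np.zeros((n_out, 1)), -Y])  # sum_j lam_j * y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(n_in), -Y[:, k]])
    bounds = [(None, None)] + [(0, None)] * n_dmus
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]  # theta in (0, 1]; 1.0 means the airport lies on the frontier

# Example usage with three hypothetical airports:
# X = np.array([[100, 80, 120], [3, 2, 4]], dtype=float)  # two inputs
# Y = np.array([[10.0, 9.0, 11.0]])                       # one output
# print([round(dea_ccr_input_oriented(X, Y, k), 3) for k in range(3)])
```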

Relevance: 20.00%

Abstract:

Acoustic sensors are increasingly used to monitor biodiversity. They can remain deployed in the environment for extended periods to passively and objectively record environmental sounds. The collected acoustic data must be analyzed to identify the sounds made by fauna in order to understand biodiversity. Citizen scientists play an important role in analyzing these data by annotating calls and identifying species. This paper presents our research into bioacoustic annotation techniques. It describes our work in defining a process for managing, creating, and using tags that are applied to our annotations. This paper includes a detailed description of our methodology for correcting our folksonomic tags and then linking them to taxonomic data sources. Providing tools and processes for maintaining species naming consistency is critical to the success of a project designed to generate scientific data. We demonstrate that cleaning the folksonomic data and providing links to external taxonomic authorities enhances the scientific utility of the tagging efforts of citizen scientists.
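
The paper's tag-cleaning workflow is not specified in this abstract; the sketch below only illustrates the general idea of correcting free-form (folksonomic) tags and linking them to a taxonomic authority, using a hypothetical correction table and authority identifiers.

```python
# Hedged sketch: normalise folksonomic tags and link them to an external
# taxonomic authority. The correction table, species, and authority IDs are
# hypothetical placeholders, not the paper's actual data.
CORRECTIONS = {
    # normalised free-form tag -> canonical common name
    "lewins honeyeater": "Lewin's Honeyeater",
    "lewin's honeyeater": "Lewin's Honeyeater",
}
TAXONOMY = {
    # canonical common name -> (scientific name, authority identifier)
    "Lewin's Honeyeater": ("Meliphaga lewinii", "urn:lsid:example.org:taxname:12345"),
}

def resolve_tag(raw_tag: str):
    """Return (common name, scientific name, authority id), or None if unknown."""
    cleaned = " ".join(raw_tag.strip().lower().split())
    common = CORRECTIONS.get(cleaned)
    if common is None:
        return None
    entry = TAXONOMY.get(common)
    return (common, *entry) if entry else None

# Example usage:
# print(resolve_tag("  lewins   honeyeater "))
# -> ("Lewin's Honeyeater", 'Meliphaga lewinii', 'urn:lsid:example.org:taxname:12345')
```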