388 results for Extended techniques


Relevance: 20.00%

Abstract:

An extended theory of planned behavior (TPB) was used to understand the factors informing younger people's intentions to join a bone marrow registry, particularly control perceptions and affective reactions, given conflicting findings in previous research. Participants (N = 174) completed attitude, subjective norm, perceived behavioral control (PBC), moral norm, anticipated regret, self-identity, and intention items for registering. The extended TPB (except PBC) explained 67.2% of the variance in intention. Further testing is needed on the volitional nature of registering. Moral norm, anticipated regret, and self-identity are likely intervention targets for increasing younger people's bone marrow registry participation.

Relevance: 20.00%

Abstract:

Due to the critical shortage of and continued need for blood and organ donations (ODs), research exploring similarities and differences in the motivational determinants of these behaviors is needed. In a sample of 258 university students, we used a cross-sectional design to test the utility of an extended theory of planned behavior (TPB), including moral norm, self-identity, and in-group altruism (family/close friends and ethnic group), in predicting people's blood and OD intentions. Overall, the extended TPB explained 77.0% and 74.6% of the variance in blood and OD intentions, respectively. In regression analyses, common contributors to intentions across donation contexts were attitude, self-efficacy, and self-identity. Normative influences varied, with subjective norm a significant predictor of OD intentions but not blood donation intentions at the final step of the regression analyses. Moral norm did not contribute significantly to blood or OD intentions. In-group altruism (family/close friends) was significantly related to OD intentions only. Future donation strategies should increase confidence to donate, foster a perception of self as the type of person who donates blood and/or organs, and address preferences to donate organs to in-group members only.

Relevance: 20.00%

Abstract:

Type 2 diabetes remains an escalating worldwide problem, despite a range of treatments. The revelation that insulin secretion is under the control of a gut hormone, glucagon-like peptide 1 (GLP-1), led to a new paradigm in the management of type 2 diabetes: medicines that directly stimulate GLP-1 receptors, or that prolong the actions of endogenous GLP-1 at those receptors. Exenatide is an agonist at the GLP-1 receptors and was initially developed as a subcutaneous twice-daily medication, ExBID. The clinical trials with ExBID established a role for exenatide in the treatment of type 2 diabetes. Subsequently, once-weekly exenatide (ExQW) was shown to have advantages over ExBID, and there is now more emphasis on the development of ExQW. ExQW alone reduces glycosylated haemoglobin (HbA1c) and body weight, and is well tolerated. ExQW has been compared to sitagliptin, pioglitazone, and metformin, and shown to have a greater ability to reduce HbA1c than these other medicines. The only insulin preparation to which ExQW has been compared is insulin glargine, and ExQW has some favourable properties in this comparison, notably causing weight loss rather than the weight gain seen with insulin glargine. ExQW has also been compared to another GLP-1 receptor agonist, liraglutide, and is non-inferior to liraglutide in reducing HbA1c. The small amount of evidence available shows that subjects with type 2 diabetes prefer ExQW to ExBID, and that adherence to both was high in the clinical trial setting. Healthcare and economic modelling suggests that, with long-term use, ExQW will reduce diabetic complications and be cost-effective compared to other medications. Little is known about whether subjects with type 2 diabetes prefer ExQW to other medicines, or whether adherence to ExQW is good in practice; these important topics require further study.

Relevance: 20.00%

Abstract:

Airport efficiency is important because it has a direct impact on customer safety and satisfaction, and therefore on the financial performance and sustainability of airports, airlines, and affiliated service providers. This is especially so in a world characterized by an increasing volume of both domestic and international air travel; price and other forms of competition between rival airports, airport hubs, and airlines; and rapid and sometimes unexpected changes in airline routes and carriers. It also reflects expansion in the number of airports handling regional, national, and international traffic and the growth of complementary airport facilities, including industrial, commercial, and retail premises. This has fostered a steadily increasing volume of research aimed at modeling and providing best-practice measures and estimates of airport efficiency using mathematical and econometric frontiers. The purpose of this chapter is to review these various methods as they apply to airports throughout the world. Apart from discussing the strengths and weaknesses of the different approaches and their key findings, the chapter also examines the steps faced by researchers as they move through the modeling process in defining airport inputs and outputs and the purported efficiency drivers. Accordingly, the chapter provides guidance to those conducting empirical research on airport efficiency and serves as an aid for aviation regulators, airport operators, and others interpreting airport efficiency research outcomes.
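As one concrete illustration of the mathematical-frontier methods surveyed in this chapter, the sketch below solves a basic input-oriented CCR data envelopment analysis (DEA) model with scipy. The airports, the choice of inputs (runways, staff) and output (passengers), and all figures are hypothetical assumptions, not data from the chapter.

```python
# Minimal input-oriented CCR DEA sketch; all airport figures are invented.
import numpy as np
from scipy.optimize import linprog

# rows = airports (DMUs); columns are inputs / outputs (hypothetical)
X = np.array([[2.0, 300.0],
              [3.0, 500.0],
              [1.0, 200.0],
              [4.0, 900.0]])                  # inputs: runways, staff
Y = np.array([[5.0], [8.0], [2.5], [9.0]])    # output: passengers (millions)

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o (1.0 = on the frontier)."""
    n, m = X.shape                   # n airports, m inputs
    s = Y.shape[1]                   # s outputs
    c = np.r_[1.0, np.zeros(n)]      # minimise theta; vars = [theta, lambdas]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return res.fun

for o in range(len(X)):
    print(f"airport {o}: technical efficiency = {ccr_efficiency(o, X, Y):.3f}")
```

An efficiency of 1.0 marks an airport on the best-practice frontier; values below 1.0 indicate the proportional input reduction a frontier peer group could achieve for the same output.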

Relevance: 20.00%

Abstract:

Acoustic sensors are increasingly used to monitor biodiversity. They can remain deployed for extended periods to passively and objectively record the sounds of the environment. The collected acoustic data must then be analyzed to identify the sounds made by fauna in order to understand biodiversity. Citizen scientists play an important role in analyzing these data by annotating calls and identifying species. This paper presents our research into bioacoustic annotation techniques. It describes our work in defining a process for managing, creating, and using the tags that are applied to our annotations, including a detailed description of our methodology for correcting our folksonomic tags and then linking them to taxonomic data sources. Providing tools and processes for maintaining species naming consistency is critical to the success of a project designed to generate scientific data. We demonstrate that cleaning the folksonomic data and providing links to external taxonomic authorities enhances the scientific utility of the tagging efforts of citizen scientists.
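As a rough illustration of the cleaning-and-linking step described above (not the project's actual pipeline), the sketch below normalises folksonomic tags and fuzzy-matches them against a stand-in taxonomic authority; the species names, identifiers, and difflib-based matching are all assumptions.

```python
# Toy folksonomy-to-taxonomy linking; names and identifiers are hypothetical.
import difflib

# canonical names mapped to identifiers from an external taxonomic source
TAXONOMY = {
    "litoria fallax": "urn:lsid:example.org:taxon:1234",   # hypothetical IDs
    "ninox boobook": "urn:lsid:example.org:taxon:5678",
}

def clean_tag(raw: str) -> str:
    """Normalise case, underscores and whitespace in a folksonomic tag."""
    return " ".join(raw.lower().replace("_", " ").split())

def link_tag(raw: str, cutoff: float = 0.8):
    """Return (canonical name, identifier), or None if no close match."""
    tag = clean_tag(raw)
    match = difflib.get_close_matches(tag, list(TAXONOMY), n=1, cutoff=cutoff)
    return (match[0], TAXONOMY[match[0]]) if match else None

print(link_tag("Litoria_fallax"))   # exact after cleaning
print(link_tag("litoria falax"))    # misspelling caught by fuzzy matching
print(link_tag("unknown frog"))     # no link; left for manual review
```

Unmatched tags falling below the cutoff would be queued for manual review rather than silently discarded, preserving the citizen scientists' effort.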

Relevance: 20.00%

Abstract:

In the field of rolling element bearing diagnostics, envelope analysis, and in particular the squared envelope spectrum, has in recent years gained a leading role among digital signal processing techniques. The original constraint of constant operating speed has been relaxed by combining this technique with computed order tracking, which resamples signals at constant angular increments. In this way, the field of application of the squared envelope spectrum has been extended to cases in which small speed fluctuations occur, while maintaining the effectiveness and efficiency that characterize this successful technique. However, to obtain an algorithm valid for all industrial applications, the constraint on speed has to be removed completely, making envelope analysis suitable also for speed and load transients. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This paper is aimed at providing and testing a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
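For readers unfamiliar with the baseline technique, the sketch below computes a squared envelope spectrum for a synthetic constant-speed bearing signal; the paper's actual contribution, extending this to speed transients, is not reproduced here, and the sampling rate, fault frequency, and resonance values are invented.

```python
# Squared envelope spectrum of a synthetic outer-race fault signal.
import numpy as np
from scipy.signal import hilbert

fs = 20_000                          # sampling rate (Hz), hypothetical
t = np.arange(0, 1.0, 1 / fs)
bpfo = 87.0                          # assumed outer-race fault frequency (Hz)

# train of impacts at the fault rate, each exciting a 3 kHz resonance
impacts = (np.sin(2 * np.pi * bpfo * t) > 0.999).astype(float)
ringing = np.exp(-800 * t[:200]) * np.sin(2 * np.pi * 3000 * t[:200])
x = np.convolve(impacts, ringing, mode="same") + 0.1 * np.random.randn(t.size)

env2 = np.abs(hilbert(x)) ** 2       # squared envelope via the analytic signal
env2 -= env2.mean()                  # remove DC before the FFT
ses = np.abs(np.fft.rfft(env2))      # squared envelope spectrum
freqs = np.fft.rfftfreq(env2.size, 1 / fs)

band = freqs < 500                   # fault lines live at low frequency
peak = freqs[band][np.argmax(ses[band][1:]) + 1]   # skip residual DC bin
print(f"dominant envelope line: {peak:.1f} Hz (expected near {bpfo} Hz)")
```

Under a varying speed, the fault line smears across bins, which is exactly why the paper couples envelope analysis with angular resampling.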

Relevance: 20.00%

Abstract:

In the field of diagnostics of rolling element bearings, the development of sophisticated techniques, such as spectral kurtosis and second-order cyclostationarity, has extended the capability of expert users to identify not only the presence, but also the location, of damage in the bearing. Most signal-analysis methods, such as the ones previously mentioned, result in a spectrum-like diagram that, in case of damage, presents line frequencies or peaks in the neighbourhood of certain theoretical characteristic frequencies. These frequencies depend only on damage position, bearing geometry, and rotational speed. The major improvement in this field would be the development of algorithms with a high degree of automation. This paper aims at this important objective by discussing, for the first time, how these peaks can drift away from the theoretically expected frequencies as a function of different working conditions, i.e. speed, torque, and lubrication. After providing a brief description of the peak patterns associated with each type of damage, the paper shows the typical magnitudes of the deviations from the theoretically expected frequencies. The last part of the study presents some remarks on increasing the reliability of the automatic algorithm. The research is based on experimental data obtained using artificially damaged bearings installed in a gearbox.
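The theoretical characteristic frequencies the paper refers to follow from standard bearing kinematics. The sketch below computes them for a hypothetical geometry and derives the kind of tolerance band an automatic algorithm might search, given that measured peaks can deviate from the theoretical values.

```python
# Standard kinematic fault frequencies for a rolling element bearing;
# the geometry values and the 2% tolerance are hypothetical.
import numpy as np

def fault_frequencies(fr, n, d, D, phi=0.0):
    """fr: shaft speed (Hz), n: number of rolling elements,
    d: element diameter, D: pitch diameter, phi: contact angle (rad)."""
    r = d / D * np.cos(phi)
    return {
        "BPFO": n / 2 * fr * (1 - r),             # outer race defect
        "BPFI": n / 2 * fr * (1 + r),             # inner race defect
        "BSF":  D / (2 * d) * fr * (1 - r ** 2),  # rolling element spin
        "FTF":  fr / 2 * (1 - r),                 # cage (train) frequency
    }

freqs = fault_frequencies(fr=25.0, n=9, d=7.9e-3, D=34.5e-3)
tol = 0.02   # allow a 2% deviation band around each theoretical line
for name, f in freqs.items():
    print(f"{name}: {f:6.1f} Hz, search band "
          f"{f * (1 - tol):.1f}-{f * (1 + tol):.1f} Hz")
```

An automated detector would look for spectral peaks inside each band rather than at the exact theoretical frequency, which is the robustness issue the paper quantifies.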

Relevance: 20.00%

Abstract:

In the field of rolling element bearing diagnostics, envelope analysis has in recent years gained a leading role among digital signal processing techniques. The original constraint of constant operating speed has been relaxed by combining this technique with computed order tracking, which resamples signals at constant angular increments. In this way, the field of application of the technique has been extended to cases in which small speed fluctuations occur, while maintaining high effectiveness and efficiency. In order to make the algorithm suitable for all industrial applications, the constraint on speed has to be removed completely. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This chapter presents a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
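The computed order tracking step mentioned above can be shown with a toy example: the sketch below resamples a synthetic signal, recorded under a speed ramp, at constant angular increments so that a shaft-locked component becomes a sharp line in the order spectrum. All signal parameters are invented.

```python
# Computed order tracking sketch: time-domain signal under a speed ramp
# is resampled at constant shaft-angle increments.
import numpy as np

fs = 10_000
t = np.arange(0, 2.0, 1 / fs)
speed_hz = 10 + 15 * t / t[-1]          # shaft speed ramping 10 -> 25 Hz

# instantaneous shaft angle (in revolutions) from the speed profile
angle = np.cumsum(speed_hz) / fs

# shaft-locked component at the 5th order (5 events per revolution) + noise
x = np.sin(2 * np.pi * 5 * angle) + 0.1 * np.random.randn(t.size)

# resample at constant angular increments (64 samples per revolution)
samples_per_rev = 64
uniform_angle = np.arange(0, angle[-1], 1 / samples_per_rev)
x_angular = np.interp(uniform_angle, angle, x)

# in the order spectrum the 5th order is a sharp line despite the ramp
orders = np.fft.rfftfreq(x_angular.size, 1 / samples_per_rev)
spectrum = np.abs(np.fft.rfft(x_angular))
print(f"dominant order: {orders[np.argmax(spectrum[1:]) + 1]:.2f} (expected 5)")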

Relevance: 20.00%

Abstract:

This paper presents a comparative study of the response of a buried tunnel to surface blast using the arbitrary Lagrangian-Eulerian (ALE) and smoothed particle hydrodynamics (SPH) techniques. Since explosive tests with real physical models are extremely risky and expensive, the results of a centrifuge test were used to validate the numerical techniques. The numerical study shows that the ALE predictions were faster and closer to the experimental results than those from the SPH simulations, which overpredicted the strains. The findings of this research demonstrate the superiority of the ALE modelling techniques for the present study. They also provide a comprehensive understanding of the preferred ALE modelling techniques, which can be used to investigate the surface blast response of underground tunnels.
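To give a flavour of the particle side of this comparison, the sketch below performs the most basic SPH operation, a kernel-weighted density summation in one dimension; it is a toy under assumed values, not a stand-in for the coupled blast simulations used in the paper.

```python
# Toy 1-D SPH density summation with the standard cubic spline (M4) kernel;
# spacing, smoothing length, and material density are arbitrary choices.
import numpy as np

def cubic_spline_1d(r, h):
    """Standard 1-D cubic spline kernel, normalised so it integrates to 1."""
    q = np.abs(r) / h
    w = np.where(q <= 1, 1 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q <= 2, 0.25 * (2 - q)**3, 0.0))
    return (2.0 / (3.0 * h)) * w

dx = 0.01                          # particle spacing (m)
x = np.arange(0.0, 1.0, dx)        # particle positions
m = 1000.0 * dx * np.ones_like(x)  # masses for a 1000 kg/m^3 material
h = 1.2 * dx                       # smoothing length

# density by direct summation: rho_i = sum_j m_j * W(x_i - x_j, h)
rho = np.array([np.sum(m * cubic_spline_1d(x - xi, h)) for xi in x])
print(f"interior density ~ {rho[len(x) // 2]:.1f} kg/m^3 (expect ~1000)")
```

The characteristic SPH artefacts near free boundaries (here, the ends of the particle line, where the summation under-counts neighbours) hint at why strain predictions can drift, although the paper's comparison is of course far more involved.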

Relevance: 20.00%

Abstract:

This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-utterance i-vectors vary with speaker, session variations, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA), and within-class covariance normalisation (WCCN) exist for compensating for session variation, but we have identified the variability introduced by phonetic content as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short-utterance i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations, and is shown to improve on the traditional approach of LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA, and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that, for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
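As a point of reference for the compensation techniques discussed, the sketch below implements plain cosine similarity scoring between i-vectors after an LDA projection; the i-vectors and projection matrix are random placeholders, and the proposed SUVN/SUV steps are not reproduced.

```python
# Baseline cosine similarity scoring (CSS) between i-vectors; the
# "trained" LDA matrix and the i-vectors are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
dim, lda_dim = 400, 200

A = rng.standard_normal((dim, lda_dim))   # placeholder for a trained LDA matrix

def css_score(w_enrol, w_test, A):
    """Project both i-vectors with LDA, then score with the cosine kernel."""
    e, t = A.T @ w_enrol, A.T @ w_test
    return float(e @ t / (np.linalg.norm(e) * np.linalg.norm(t)))

w_target = rng.standard_normal(dim)
w_same = w_target + 0.3 * rng.standard_normal(dim)   # same speaker, new session
w_other = rng.standard_normal(dim)                   # different speaker

print("same-speaker score:", round(css_score(w_target, w_same, A), 3))
print("impostor score:    ", round(css_score(w_target, w_other, A), 3))
```

A verification decision compares such scores to a threshold; the paper's SUVN step would additionally normalise the variance that short, phonetically sparse utterances introduce before this scoring.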

Relevance: 20.00%

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as it requires many interactions between agents, which can learn and pursue goals. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems.

One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Using such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.

Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This approach diverges from the traditional one, in which both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is, as the sketch following this abstract illustrates. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required, depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond, e.g. weather for solar panels, or to describe the assets and their relation to one another, e.g. the network assets.

Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, sequentially or in parallel for faster runs. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is part of future work with the addition of electric vehicles.
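The asset/agent separation described above can be sketched compactly (in Python here, rather than the OSGi/Eclipse Java plugins MODAM is actually built on): one battery asset carrying only physical characteristics is reused by two agents with different behaviours. All class names and figures are illustrative.

```python
# Illustrative asset/agent split: physical description vs. behaviour.
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only; reusable across simulations."""
    capacity_kwh: float
    depth_of_discharge: float

class Agent:
    """Behaviour only: decides how its asset is used at each time step."""
    def __init__(self, asset: BatteryAsset):
        self.asset = asset

    def step(self, hour: int) -> float:
        """Energy flow in kWh (+ charge, - discharge) for this hour."""
        raise NotImplementedError

class PeakShavingAgent(Agent):
    def step(self, hour):            # discharge during the evening peak
        return -0.1 * self.asset.capacity_kwh if 17 <= hour % 24 <= 20 else 0.0

class SolarShiftAgent(Agent):
    def step(self, hour):            # charge at midday from solar surplus
        return 0.1 * self.asset.capacity_kwh if 11 <= hour % 24 <= 14 else 0.0

# one physical description, two different behaviours
battery = BatteryAsset(capacity_kwh=10.0, depth_of_discharge=0.8)
for agent in (PeakShavingAgent(battery), SolarShiftAgent(battery)):
    profile = [agent.step(h) for h in range(24)]
    print(type(agent).__name__, profile)
```

Swapping the agent class changes network behaviour without touching the asset description, which is the composability benefit the abstract claims for MODAM.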

Relevance: 20.00%

Abstract:

Pile foundations transfer loads from superstructures to stronger subsoil; their strength and stability can hence affect structural safety. This paper treats the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. Results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.

Relevance: 20.00%

Abstract:

Electrocatalytic processes will undoubtedly be at the heart of energising future transportation and technology, with the added importance of being able to create the necessary fuels to do so in an environmentally friendly and cost-effective manner. For this to be successful, two almost mutually exclusive surface properties need to be reconciled: producing highly active/reactive surface sites that also exhibit long-term stability. This article reviews the various approaches that have been undertaken to study the elusive nature of these active sites on metal surfaces, which are considered to be adatoms or clusters of adatoms with low coordination numbers. This ranges from the pioneering studies of extended, well-defined stepped single crystal surfaces using cyclic voltammetry up to the highly sophisticated in situ electrochemical imaging techniques used to study chemically synthesised nanomaterials. By combining the information attained from single crystal surfaces, individual nanoparticles of defined size and shape, density functional theory calculations, and new concepts such as mesoporous multimetallic thin films and single-atom electrocatalysts, new insights into the design and fabrication of materials with highly active but stable sites can be achieved. The area of electrocatalysis is therefore a fascinating and exciting field, not only in terms of realistic technological and economic benefits but also for the fundamental understanding that can be acquired by studying such an array of interesting materials.

Relevance: 20.00%

Abstract:

Rigid lenses, which were originally made from glass (between 1888 and 1940) and later from polymethyl methacrylate or silicone acrylate materials, are uncomfortable to wear and are now seldom fitted to new patients. Contact lenses became a popular mode of ophthalmic refractive error correction following the discovery of the first hydrogel material – hydroxyethyl methacrylate – by Czech chemist Otto Wichterle in 1960. To satisfy the requirements for ocular biocompatibility, contact lenses must be transparent and optically stable (for clear vision), have a low elastic modulus (for good comfort), have a hydrophilic surface (for good wettability), and be permeable to certain metabolites, especially oxygen, to allow for normal corneal metabolism and respiration during lens wear. A major breakthrough in respect of the last of these requirements was the development of silicone hydrogel soft lenses in 1999 and techniques for making the surface hydrophilic. The vast majority of contact lenses distributed worldwide are mass-produced using cast molding, although spin casting is also used. These advanced mass-production techniques have facilitated the frequent disposal of contact lenses, leading to improvements in ocular health and fewer complications. More than one-third of all soft contact lenses sold today are designed to be discarded daily (i.e., ‘daily disposable’ lenses).

Relevance: 20.00%

Abstract:

We extended the previous work of Moss, O'Connor and White to include a measure of group norms within the theory of planned behaviour (TPB), examining the influences on students' decisions to use lecture podcasts as part of their learning. Participants (N = 90) completed the extended TPB predictors before semester began (Time 1) and at mid-semester (Time 2), and reported on their podcast use at mid-semester (Time 2) and at the end of semester (Time 3). We found that attitudes and perceived social pressures were important in informing intentions at both time points. At Time 1, perceptions of control over performing the behaviour and, at Time 2, perceptions of whether podcast use was normative among fellow students (group norms) also predicted intended podcast use. Intentions to use podcasting predicted self-reported use at both Time 2 and Time 3. These results provide important applied information for educators seeking to encourage student use of novel online educational tools.