Abstract:
Introduction Behavioural interventions have been shown to improve outcomes in patients with type 1 diabetes mellitus (T1DM). A small number of studies suggest that text messages (TM), native mobile applications (NMAs), and other mobile tools may be useful platforms for delivering behavioural interventions to adolescents. Aim The aim of this study was to explore, by way of a systematic review of the available literature, (a) the outcomes of interventions using mobile technology for youth with T1DM and (b) which mobile technologies, functional design elements and aesthetic design elements have the best evidence to support their use. Methods A search of six online databases returned 196 unique results, of which 13 met the inclusion criteria. Results Four studies were randomised controlled trials (RCTs), and all others were prospective cohort studies. TM was the most common intervention technology (10 studies), while NMAs were used in four studies. The most common outcome measured was HbA1c (9 studies); however, only three studies showed a significant decrease. Similarly, the results reported for other outcome measures were mixed. The studies included in this review suggest that interventions with data collection and clinician support functionality may be more effective in improving adherence and glycaemic control, but more evidence is needed. Further, the evidence base supporting the use of NMAs in T1DM management for adolescents is weak, with most studies adopting TM as the intervention tool. Overall, the studies lack adequate descriptions of their methodology, and better-quality studies are required to inform future intervention design.
Abstract:
Phage display is an advanced technology that can be used to characterize the interactions of antibody with antigen at the molecular level. It provides valuable data when applied to the investigation of IgE interaction with allergens. The aim of this rostrum article is to provide an explanation of the potential of phage display for increasing the understanding of allergen-IgE interaction, the discovery of diagnostic reagents, and the development of novel therapeutics for the treatment of allergic disease. The significance of initial studies that have applied phage display technology in allergy research will be highlighted. Phage display has been used to clone human IgE to the timothy grass pollen allergen Phl p 5, to characterize the epitopes for murine and human antibodies to the birch pollen allergen Bet v 1, and to elucidate the epitopes of a murine mAb to the house dust mite allergen Der p 1. The technology has identified peptides that functionally mimic sites of human IgE constant domains and that were used to raise antiserum for blocking binding of IgE to the FcεRI on basophils and subsequent release of histamine. Phage display has also been used to characterize novel peanut and fungal allergens. The method has been used to increase our understanding of the molecular basis of allergen-IgE interactions and to develop clinically relevant reagents with the pharmacologic potential to block the effector phase of allergic reactions. Many advances from these early studies are likely as phage display technology evolves and allergists gain expertise in its research applications.
Abstract:
Purpose This paper aims to use the Model of Goal-Directed Behavior (MGB) to examine the factors affecting consumers' continued use of emerging technology-based self-services (TBSSs) with credence qualities. Professional services, which owing to their credence qualities traditionally require specialized knowledge and high levels of interpersonal interaction to produce, are increasingly delivered via self-service technologies. Health services delivered via mobile devices, for example, facilitate self-care without direct involvement from health professionals. Design/methodology/approach A mental health service delivered via the Internet and mobile phone, myCompass, was selected as the research context. Twenty interviews were conducted with users of myCompass and the data were thematically analyzed. Findings The findings of the study showcase the unique determinants of consumers' continued use of TBSSs with credence qualities relative to the more routine services that have been the focus of extant research. The findings further provide support for the utility of the MGB in explaining service continuance, although the importance of distinguishing between extrinsic and intrinsic motivational components of behavioral desire, and of capturing the impact of social influence beyond subjective norms, is also highlighted. Originality/value This study contributes to recent research examining differences in consumer responses across TBSSs and behavioral loyalty to these services. It also provides empirical evidence for broadening and deepening the MGB within this behavioral domain.
Abstract:
This paper is a qualitative, practice-based study describing the use of the Focus-Action-Reflection (FAR) Guide (Harrison and Treagust, 2000) to address the shortcomings of a pedagogical analogical model in Year 10 Science. The aim of this paper is to present my experience of the FAR Guide in relation to an analogical model whose shortcomings were perceived by both teachers and students. This study found the FAR Guide to be a highly valuable tool, transforming the presentation of the analogical model and enabling students to develop a deeper understanding of the nature of scientific knowledge.
Abstract:
Fluid bed granulation is a key pharmaceutical process which improves many of the powder properties required for tablet compression. The fluid bed granulation process comprises dry mixing, wetting and drying phases. Granules of high quality can be obtained by understanding and controlling the critical process parameters through timely measurements. Process analytical technologies (PAT) include the integrated analysis of physical process measurements and particle size data from a fluid bed granulator. Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture influences many process, granule and tablet qualities. The approach in this thesis was to identify sources of variation that are mainly related to moisture. The aim was to determine correlations and relationships, and to utilise the PAT and design space concepts for fluid bed granulation and drying. Monitoring material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of the process and end-product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effects of inlet air humidity and granulation liquid feed on temperature measurements at different locations of the fluid bed granulator system were determined. This revealed dynamic changes in the measurements and enabled identification of the optimal sites for process control.
The moisture originating from the granulation liquid and inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimise particle size. Various end-point indication techniques for drying were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and gave the most precise estimation of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined. The feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter that describes the behaviour of material in a fluid bed was developed. The flow rate of the process air and the turbine fan speed were used to calculate this parameter, which was compared to the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation were determined from the fluidisation parameters. With this design space it is possible to avoid both excessive and improper fluidisation as well as bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser. Both rapid increases and decreases in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had been formed. The various physical parameters evaluated in this thesis give valuable information about fluid bed process performance and increase process understanding.
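The idea behind a ∆T-style end-point check can be sketched in a few lines. Evaporative cooling holds the product temperature below the inlet air temperature while moisture remains, so the inlet-to-product temperature difference shrinks as drying completes; the end-point can be flagged when that difference stays small. The thesis does not publish its exact criterion, so the signal names, threshold and window below are hypothetical:

```python
def delta_t_endpoint(inlet_temps, product_temps, threshold=2.0, window=3):
    """Return the index where ΔT = inlet - product first stays below
    `threshold` for `window` consecutive measurements, or None."""
    run = 0
    for i, (t_in, t_prod) in enumerate(zip(inlet_temps, product_temps)):
        if t_in - t_prod < threshold:
            run += 1
            if run == window:
                return i - window + 1  # start of the stable run
        else:
            run = 0
    return None

# As drying finishes, the product temperature approaches the inlet temperature:
end = delta_t_endpoint([60] * 8, [40, 45, 50, 55, 58, 59, 59, 59])
```

A persistence window like this is one simple way to keep transient humidity fluctuations from triggering a premature end-point call.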
Abstract:
There is a need for better understanding of pharmaceutical processes and for new ideas to develop traditional pharmaceutical powder manufacturing procedures. Process analytical technology (PAT) has been developed to improve understanding of these processes and to establish methods to monitor and control them. The interest is in maintaining and even improving the whole manufacturing process and the final products in real time. Process understanding can be a foundation for innovation and continuous improvement in pharmaceutical development and manufacturing. New methods are needed to increase the quality and safety of the final products faster and more efficiently than ever before. Real-time process monitoring demands tools which enable fast and noninvasive measurements with sufficient accuracy. Traditional quality control methods have been laborious and time consuming, and they are performed off-line, i.e. the analysis is removed from the process area. Vibrational spectroscopic methods are responding to this challenge, and their utilisation has increased considerably during the past few years. In addition, other methods such as colour analysis can be utilised in noninvasive real-time process monitoring. In this study three pharmaceutical processes were investigated: drying, mixing and tabletting. In addition, tablet properties were evaluated. Real-time monitoring was performed with NIR and Raman spectroscopies, colour analysis and particle size analysis, and compression data during tabletting were evaluated using mathematical modelling. These methods were suitable for real-time monitoring of pharmaceutical unit operations and increase knowledge of the critical parameters in the processes and the phenomena occurring during operations. They can improve our process understanding and therefore, finally, enhance the quality of the final products.
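One common way to turn such real-time measurements into monitoring decisions is a simple control-chart check against statistical limits. The sketch below is a generic illustration only; the signal, target and limits are invented and not taken from the study:

```python
def out_of_control(values, mean, sigma, k=3.0):
    """Return the indices of measurements outside mean ± k*sigma
    (e.g. an in-line NIR moisture estimate drifting off target)."""
    lo, hi = mean - k * sigma, mean + k * sigma
    return [i for i, v in enumerate(values) if v < lo or v > hi]

# A run of hypothetical in-line readings with one excursion at index 2:
alarms = out_of_control([1.0, 1.1, 2.5, 0.9], mean=1.0, sigma=0.1)
```

In practice the limits would come from validated reference batches rather than fixed constants, but the decision logic stays this simple.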
Abstract:
This study extends understanding of consumers' decisions to adopt transformative services delivered via technology. It incorporates competitive effects into the model of goal-directed behavior which, in keeping with the majority of consumer decision making models, neglects to explicitly account for competition. A goal-level operationalization of competition, incorporating both direct and indirect competition, is proposed. A national web-based survey collected data from 431 respondents about their decisions to adopt mental health services delivered via mobile phone. The findings show that the extent to which consumers perceived using these transformative services to be more instrumental to achieving their goals than competition had the greatest impact on their adoption decisions. This finding builds on the limited empirical evidence for the inclusion of competitive effects to more fully explain consumers' decisions to adopt technology-based and other services. It also provides support for a broader operationalization of competition with respect to consumers' personal goals.
Abstract:
The article describes a generalized estimating equations approach that was used to investigate the impact of technology on vessel performance in a trawl fishery during 1988-96, while accounting for spatial and temporal correlations in the catch-effort data. Robust estimation of parameters in the presence of several levels of clustering depended more on the choice of cluster definition than on the choice of correlation structure within the cluster. Models with smaller cluster sizes produced stable results, while models with larger cluster sizes, that may have had complex within-cluster correlation structures and that had within-cluster covariates, produced estimates sensitive to the correlation structure. The preferred model arising from this dataset assumed that catches from a vessel were correlated in the same years and the same areas, but independent in different years and areas. The model that assumed catches from a vessel were correlated in all years and areas, equivalent to a random effects term for vessel, produced spurious results. This was an unexpected finding that highlighted the need to adopt a systematic strategy for modelling. The article proposes a modelling strategy of selecting the best cluster definition first, and the working correlation structure (within clusters) second. The article discusses the selection and interpretation of the model in the light of background knowledge of the data and utility of the model, and the potential for this modelling approach to apply in similar statistical situations.
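The proposed strategy of choosing the cluster definition before the working correlation structure can be pictured with a toy grouping of catch-effort records. The records below are invented; only the two cluster definitions mirror the article:

```python
from collections import Counter

# Hypothetical catch-effort records as (vessel, year, area) tuples.
records = [
    ("V1", 1988, "A"), ("V1", 1988, "A"), ("V1", 1989, "B"),
    ("V2", 1988, "A"), ("V2", 1989, "B"), ("V2", 1989, "B"),
]

# Preferred definition: a cluster is a vessel-year-area cell, so catches are
# correlated within the same year and area but independent across them.
small_clusters = Counter(records)

# Larger definition: all of a vessel's records form one cluster (equivalent
# to a vessel random effect), which the article found gave spurious results.
large_clusters = Counter(vessel for vessel, _, _ in records)
```

The smaller cells keep the within-cluster correlation structure simple, which is what made the estimates stable in the article's comparison.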
Abstract:
A panel of 19 monoclonal antibodies (mAbs) was used to study the immunological variability of Lettuce mosaic virus (LMV), a member of the genus Potyvirus, and to perform a first epitope characterization of this virus. Based on their specificity of recognition against a panel of 15 LMV isolates, the mAbs could be clustered into seven reactivity groups. Surface plasmon resonance analysis indicated the presence, on the LMV particles, of at least five independent recognition/binding regions, correlating with the seven mAb reactivity groups. The results demonstrate that LMV shows significant serological variability and shed light on the LMV epitope structure. The various mAbs should prove to be a new and efficient tool for LMV diagnostic and field epidemiology studies.
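Grouping antibodies by their recognition pattern, as done for the reactivity groups above, amounts to collecting mAbs that share the same profile across the isolate panel. The tiny patterns below are invented for illustration (the study itself used 19 mAbs against 15 isolates, yielding seven groups):

```python
def reactivity_groups(patterns):
    """Map each distinct reactivity pattern (tuple of 0/1 per isolate)
    to the list of mAbs sharing it."""
    groups = {}
    for mab, pattern in patterns.items():
        groups.setdefault(tuple(pattern), []).append(mab)
    return groups

# Hypothetical 0/1 reactivity of three mAbs against three isolates:
mabs = {
    "mAb1": [1, 1, 0], "mAb2": [1, 1, 0],  # same pattern -> same group
    "mAb3": [0, 1, 1],
}
groups = reactivity_groups(mabs)
```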
Abstract:
The integration of technology in care is core business in nursing, and this role requires that we understand and use technology informed by evidence that goes much deeper and broader than actions and behaviours. We need to delve more deeply into its complexity, because there is nothing minor or insignificant about technology as a major influence on healthcare outcomes and experiences. Evidence is needed that addresses technology and nursing from perspectives that examine the effects of technology, especially related to increasing demands for efficiency, the relationship of technology to nursing and caring, and a range of philosophical questions associated with empowering people in their healthcare choices. Specifically, there is a need to confront in practice the ways technique influences care. Technique is the creation of a kind of thinking that is necessary for contemporary healthcare technology to develop and be applied in an efficient and rational manner. Technique is not an entity or specific thing, but rather a way of thinking that seeks to shape and organize nursing activity, and to manage efficiently individual difference(s) in care. It emphasizes predetermined causal relationships, conformity, and sameness of product, process, and thought. In response, a radical vision of nursing is needed, one that attempts in a real sense to ensure we meet the needs of individuals and their community. Activism and advocacy are needed, as well as a willingness to create a certain detachment from the imperatives that technique demands. It is argued that our responsibility as nurses is to respond in practice to the errors, advantages, difficulties, and temptations of technology for the benefit of those who most need our assistance and care.
Abstract:
Public-private partnerships (PPPs) have generated a lot of interest from governments around the world for leveraging private sector involvement in developing and sustaining public infrastructure and services. Initially, PPPs were favoured by transport, energy, and other large infrastructure-intensive sectors. More recently, the concept has been expanded to include social sectors such as education.
Abstract:
Strategies of scientific, question-driven inquiry are considered important cultural practices that should be taught in schools and universities. The present study focuses on investigating multiple efforts to implement a model of Progressive Inquiry and related Web-based tools in primary, secondary and university level education, in order to develop guidelines for educators in promoting students' collaborative inquiry practices with technology. The research consists of four studies. In Study I, the aims were to investigate how a human tutor contributed to the university students' collaborative inquiry process through virtual forums, and how the influence of the tutoring activities is demonstrated in the students' inquiry discourse. Study II examined an effort to implement technology-enhanced progressive inquiry as a distance working project in a middle school context. Study III examined multiple teachers' methods of organizing progressive inquiry projects in primary and secondary classrooms through a generic analysis framework. In Study IV, a design-based research effort consisting of four consecutive university courses, applying progressive inquiry pedagogy, was retrospectively re-analyzed in order to develop the generic design framework. The results indicate that appropriate teacher support for students' collaborative inquiry efforts appears to include interplay between spontaneity and structure. Careful consideration should be given to the content mastery, critical working strategies or essential knowledge practices that the inquiry approach is intended to promote. In particular, those elements in students' activities should be structured and directed which are central to the aim of Progressive Inquiry, but which the students do not recognize or demonstrate spontaneously, and which are usually not taken into account in existing pedagogical methods or educational conventions.
Such elements are, e.g., productive co-construction activities; sustained engagement in improving produced ideas and explanations; critical reflection of the adopted inquiry practices, and sophisticated use of modern technology for knowledge work. Concerning the scaling-up of inquiry pedagogy, it was concluded that one individual teacher can also apply the principles of Progressive Inquiry in his or her own teaching in many innovative ways, even under various institutional constraints. The developed Pedagogical Infrastructure Framework enabled recognizing and examining some central features and their interplay in the designs of examined inquiry units. The framework may help to recognize and critically evaluate the invisible learning-cultural conventions in various educational settings and can mediate discussions about how to overcome or change them.
Abstract:
New Internet and Web-based technology applications have meant significant cost and time efficiencies for many American businesses. However, many employers have not yet fully grasped the impact of these new information and communication technologies on applicants and employees with certain disabilities such as vision impairments, hearing problems or limited dexterity. Although not all applicants and employees who have a disability experience IT-access problems, for select groups they can pose a needless barrier. The increasing dominance of IT in the workplace presents both a challenge and an opportunity for workers with disabilities and their employers. It will be up to HR professionals to ensure that Web-based HR processes and workplace technologies are accessible to their employees with disabilities.
Abstract:
Background: Sorghum genome mapping based on DNA markers began in the early 1990s, and numerous genetic linkage maps of sorghum have been published in the last decade, based initially on RFLP markers, with more recent maps including AFLPs and SSRs and, very recently, Diversity Array Technology (DArT) markers. It is essential to integrate the rapidly growing body of genetic linkage data produced through DArT with the multiple genetic linkage maps for sorghum generated through other marker technologies. Here, we report on the colinearity of six independent sorghum component maps and on the integration of these component maps into a single reference resource that contains commonly utilized SSRs, AFLPs, and high-throughput DArT markers. Results: The six component maps were constructed using the MultiPoint software. The lengths of the resulting maps varied between 910 and 1528 cM. The order of the 498 markers that segregated in more than one population was highly consistent between the six individual mapping data sets. The framework consensus map was constructed using a "Neighbours" approach and contained 251 integrated bridge markers on the 10 sorghum chromosomes, spanning 1355.4 cM with an average density of one marker every 5.4 cM; these were used for the projection of the remaining markers. In total, the sorghum consensus map consisted of 1997 markers mapped to 2029 unique loci (1190 DArT loci and 839 other loci) spanning 1603.5 cM, with an average marker density of 1 marker/0.79 cM. In addition, 35 multicopy markers were identified. On average, each chromosome on the consensus map contained 203 markers, of which 58.6% were DArT markers. Non-random patterns of DNA marker distribution were observed, with some clear marker-dense regions and some marker-rare regions.
Conclusion: The final consensus map has allowed us to map a larger number of markers than possible in any individual map, to obtain a more complete coverage of the sorghum genome and to fill a number of gaps on individual maps. In addition to overall general consistency of marker order across individual component maps, good agreement in overall distances between common marker pairs across the component maps used in this study was determined, using a difference ratio calculation. The obtained consensus map can be used as a reference resource for genetic studies in different genetic backgrounds, in addition to providing a framework for transferring genetic information between different marker technologies and for integrating DArT markers with other genomic resources. DArT markers represent an affordable, high throughput marker system with great utility in molecular breeding programs, especially in crops such as sorghum where SNP arrays are not publicly available.
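The reported marker densities follow directly from the figures above and can be checked with a back-of-envelope calculation:

```python
# Density check using the map lengths and marker counts reported above.
framework_density = 1355.4 / 251    # cM per bridge marker on the framework map
consensus_density = 1603.5 / 2029   # cM per mapped locus on the full consensus map

assert round(framework_density, 1) == 5.4   # "one marker every 5.4 cM"
assert round(consensus_density, 2) == 0.79  # "1 marker/0.79 cM"
```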
Abstract:
Objective Death certificates provide an invaluable source for cancer mortality statistics; however, this value can only be realised if accurate, quantitative data can be extracted from certificates – an aim hampered by both the volume and the variable nature of certificates written in natural language. This paper proposes an automatic classification system for identifying cancer-related causes of death from death certificates. Methods Detailed features, including terms, n-grams and SNOMED CT concepts, were extracted from a collection of 447,336 death certificates. These features were used to train Support Vector Machine classifiers (one classifier for each cancer type). The classifiers were deployed in a cascaded architecture: the first level identified the presence of cancer (i.e., binary cancer/no-cancer) and the second level identified the type of cancer (according to the ICD-10 classification system). A held-out test set was used to evaluate the effectiveness of the classifiers according to precision, recall and F-measure. In addition, detailed feature analysis was performed to reveal the characteristics of a successful cancer classification model. Results The system was highly effective at identifying cancer as the underlying cause of death (F-measure 0.94). The system was also effective at determining the type of cancer for common cancers (F-measure 0.7). Rare cancers, for which there was little training data, were difficult to classify accurately (F-measure 0.12). Factors influencing performance were the amount of training data and certain ambiguous cancers (e.g., those in the stomach region). The feature analysis revealed that a combination of features was important for cancer type classification, with SNOMED CT concept and oncology-specific morphology features proving the most valuable. Conclusion The system proposed in this study provides automatic identification and characterisation of cancers from large collections of free-text death certificates.
This allows organisations such as Cancer Registries to monitor and report on cancer mortality in a timely and accurate manner. In addition, the methods and findings are generally applicable beyond cancer classification and to other sources of medical text besides death certificates.
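The cascaded architecture itself is simple to sketch: a binary first level gates a type-specific second level. In the toy version below, trivial keyword rules stand in for the trained SVM classifiers; the rules and the fallback are illustrative only (C34, C16 and C80 are genuine ICD-10 codes for lung, stomach and unspecified malignant neoplasms):

```python
def level1_has_cancer(text):
    """First level: binary cancer / no-cancer decision."""
    t = text.lower()
    return "cancer" in t or "carcinoma" in t

def level2_cancer_type(text):
    """Second level: assign an ICD-10 cancer code."""
    t = text.lower()
    if "lung" in t:
        return "C34"
    if "stomach" in t:
        return "C16"
    return "C80"  # malignant neoplasm, unspecified

def classify(text):
    """Cascade: only certificates flagged at level 1 reach level 2."""
    if not level1_has_cancer(text):
        return None
    return level2_cancer_type(text)
```

One benefit of the cascade is that the hard multi-class decision is only ever made on certificates the reliable binary classifier has already flagged, which matches the gap the study reports between the two levels' F-measures.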