810 results for Technology acceptance model
Abstract:
Sub-Saharan Africa in general, and Ghana in particular, missed out on the Green Revolution. Efforts are being made to re-introduce the revolution, and this calls for more socio-economic research into the factors influencing the adoption of new technologies; hence this study. The study sought to find out how socio-economic factors contribute to the adoption of Green Revolution technology in Ghana. The method of analysis involved maximum likelihood estimation of a probit model. The proportion of Green Revolution inputs was found to be greater for households whose heads had formal education, households with higher levels of non-farm income, credit and labor supply, and those living in urban centers. It is recommended that levels of complementary inputs such as credit, extension services and infrastructure be increased. Households should also be encouraged to form farmer groups as an important source of farm labor. Furthermore, the fundamental problem of illiteracy must be addressed by increasing levels of formal and non-formal education, and the gap between rural and urban centers must be bridged through infrastructural and rural development. Care must be taken, however, to ensure that small-scale farmers are not marginalized in terms of access to the complementary inputs that accompany effective adoption of new technology. With these policies well implemented, Ghana can catch up with her Asian counterparts in this re-introduction of the revolution.
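As a hedged illustration of the estimation strategy named in this abstract (maximum likelihood estimation of a probit adoption model), the following sketch fits a probit on synthetic data; the covariate names and effect sizes are invented, not the study's:

```python
# Sketch only: probit model of Green-Revolution input adoption, fit by
# maximum likelihood. Covariates and effect sizes are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(0, 2, n),   # head_has_formal_education (hypothetical)
    rng.normal(0, 1, n),     # non_farm_income, standardised (hypothetical)
    rng.integers(0, 2, n),   # credit_access (hypothetical)
    rng.integers(0, 2, n),   # urban_household (hypothetical)
])
latent = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.6 * X[:, 2] + 0.4 * X[:, 3]
adopt = (latent + rng.normal(0, 1, n) > 0).astype(int)  # adoption indicator

result = sm.Probit(adopt, sm.add_constant(X)).fit(disp=False)
print(result.summary())                # ML estimates of probit coefficients
print(result.get_margeff().summary())  # average marginal effects on adoption
```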
Abstract:
Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information, in hardcopy or electronic form, can heavily influence clinical behaviour and decision-making errors. Little work has been undertaken to assess the safety impact of the clinical process planning documents that guide clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents and the documents' perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types, and approaches to reducing their causes. The EDA Error Model was then used to analyse the sample documents and identify error sources and controls. The types of knowledge artefact structure used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors, as well as issues concerning medical culture and teamwork, together with recommendations for further work.
Abstract:
Developments in high-throughput genotyping provide an opportunity to explore the application of marker technology in distinctness, uniformity and stability (DUS) testing of new varieties. We have used a large set of molecular markers to assess the feasibility of a UPOV Model 2 approach: “Calibration of threshold levels for molecular characteristics against the minimum distance in traditional characteristics”. We examined 431 winter and spring barley varieties, with data from UK DUS trials comprising 28 characteristics, together with genotype data from 3072 SNP markers. Inter-varietal distances were calculated, and we found higher correlations between molecular and morphological distances than have previously been reported. When varieties were grouped by kinship, the phenotypic and genotypic distances of these groups correlated well. We estimated the minimum number of markers required and showed that there was a ceiling after which the correlations did not improve. To investigate the possibility of breaking through this ceiling, we attempted genomic prediction of phenotypes from genotypes, and higher correlations were achieved. We tested distinctness decisions made using either morphological or genotypic distances and found poor correspondence between the two methods.
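A minimal sketch of the distance-correlation idea described above, on synthetic data; the marker and trait counts mirror the abstract (3072 SNPs, 28 characteristics), but everything else is invented. A proper analysis would use a Mantel permutation test, since distance-matrix entries are not independent:

```python
# Sketch only: correlate inter-varietal distances computed from SNP
# genotypes with distances computed from morphological characteristics.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_varieties, n_snps, n_traits = 50, 3072, 28
snps = rng.integers(0, 3, (n_varieties, n_snps))    # 0/1/2 allele dosages
traits = rng.normal(0, 1, (n_varieties, n_traits))  # standardised DUS scores

geno_dist = pdist(snps, metric="hamming")       # condensed distance vectors
pheno_dist = pdist(traits, metric="euclidean")

r, _ = pearsonr(geno_dist, pheno_dist)
print(f"genotype vs morphology distance correlation: r = {r:.3f}")
```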
Abstract:
This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed: optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and Light Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed. Techniques involved in DEM generation are presented together with an accuracy evaluation, and results of DEMs reconstructed from remotely sensed data are illustrated. While DEM generation from satellite stereoscopic imagery represents a good example of the passive, multi-view imaging technology discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information; both technologies are therefore discussed in detail to convey their fundamentals.
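As one concrete example of the DEM-generation steps the chapter reviews, here is a minimal sketch that grids a synthetic LIDAR point cloud into a raster DEM by averaging per-cell elevations; real pipelines add ground filtering and interpolation of empty cells:

```python
# Sketch only: rasterise a LIDAR point cloud into a DEM by averaging the
# elevation of the points that fall into each grid cell.
import numpy as np

def points_to_dem(x, y, z, cell=1.0):
    """Average z per (cell x cell) grid cell; NaN where no points fall."""
    col = ((x - x.min()) / cell).astype(int)
    row = ((y - y.min()) / cell).astype(int)
    nrows, ncols = row.max() + 1, col.max() + 1
    sums = np.zeros((nrows, ncols))
    counts = np.zeros((nrows, ncols))
    np.add.at(sums, (row, col), z)    # accumulate elevations per cell
    np.add.at(counts, (row, col), 1)  # count points per cell
    dem = np.full((nrows, ncols), np.nan)
    mask = counts > 0
    dem[mask] = sums[mask] / counts[mask]
    return dem

rng = np.random.default_rng(2)
x, y = rng.uniform(0, 100, 10_000), rng.uniform(0, 100, 10_000)
z = 50 + 0.1 * x + rng.normal(0, 0.2, 10_000)  # gently sloping surface
print(points_to_dem(x, y, z, cell=5.0).round(1))
```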
Abstract:
The term ecosystem has been used to describe the complex interactions between living organisms and the physical world. The principles underlying ecosystems can also be applied to complex human interactions in the digital world. As internet technologies make an increasing contribution to teaching and learning practice in higher education, the principles of digital ecosystems may help us understand how to make the most of technology to benefit active, self-regulated learning, especially among groups of learners. Here, feedback on student learning is presented within a conceptual digital ecosystems model of learning. Additionally, we have developed a Web 2.0-based system, called ASSET, which incorporates multimedia and social networking features to deliver assessment feedback within the functionality of the digital ecosystems model. Both the digital ecosystems model and the ASSET system are described, and their implications for enhancing feedback on student learning are discussed.
Abstract:
This article describes a case study involving information technology managers and their new programmer recruitment policy, but its primary interest is methodological. The processes of issue generation and selection and of model conceptualization are described. Early use of “magnetic hexagons” allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With a specific issue selected, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.
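For readers unfamiliar with the end product of such an intervention, a toy stock-flow sketch of a programmer recruitment model follows; the structure and parameters are invented for illustration and are not drawn from the case study:

```python
# Sketch only: a two-stock system dynamics model (trainees -> staff) under a
# simple recruitment policy, integrated with an Euler scheme.
def simulate(months=60, dt=1.0, target=100, hire_delay=6.0, quit_rate=0.02):
    staff, trainees = 80.0, 0.0
    history = []
    for _ in range(int(months / dt)):
        hiring = max(target - (staff + trainees), 0) / hire_delay  # policy
        graduating = trainees / 12.0  # trainees become productive in ~12 mo
        quitting = staff * quit_rate
        trainees += (hiring - graduating) * dt
        staff += (graduating - quitting) * dt
        history.append(staff)
    return history

print([round(s) for s in simulate()[::12]])  # staff level, yearly snapshots
```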
Abstract:
A model based on graph isomorphisms is used to formalize software evolution. Step by step, we narrow the search space through an informed selection of attributes based on the current state of the art in software engineering, and generate a seed solution. We then traverse the resulting space using graph isomorphisms and other set operations over the vertex sets. The new solutions preserve the desired attributes. The goal of defining an isomorphism-based search mechanism is to construct predictors of evolution that can facilitate the automation of the 'software factory' paradigm. The model allows for automation via software tools implementing the concepts.
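A loose sketch of the search mechanism described above, using networkx; the attributed-graph representation and the attribute names are assumptions, not the authors' formalism:

```python
# Sketch only: represent software versions as attributed graphs, seed the
# search with a solution whose attributes match desired values, and keep
# only candidates isomorphic to the seed with node attributes respected.
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher, categorical_node_match

def seed_solution():
    g = nx.Graph()
    g.add_nodes_from([(0, {"kind": "module"}), (1, {"kind": "module"}),
                      (2, {"kind": "test"})])
    g.add_edges_from([(0, 1), (1, 2)])
    return g

def preserves_attributes(candidate, seed):
    matcher = GraphMatcher(seed, candidate,
                           node_match=categorical_node_match("kind", None))
    return matcher.is_isomorphic()

seed = seed_solution()
candidate = nx.relabel_nodes(seed, {0: "a", 1: "b", 2: "c"})
print(preserves_attributes(candidate, seed))  # True: structure and kinds kept
```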
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to process that information. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics requirements. These sets of business rules are usually not the same as the business rules used to capture the data in the OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context, the encoding of facts with context into data with business rules, the storage and sourcing of data with business rules, the decoding of data with business rules back into facts with context, and the recall of facts with associated context. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of the transactions as captured by the respective OLTP system.
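The abstract does not give UBIRQ's internal details, so the following is only a hypothetical reading of its core requirement: facts stored together with the business rules used to capture them, so that OLAP recall can preserve transaction context. The schema and names are invented:

```python
# Hypothetical schema: each captured fact references the rule set that
# produced it, so OLAP queries can recall data with its capture context.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE rule_set (
    rule_set_id INTEGER PRIMARY KEY,
    description TEXT,   -- human-readable statement of the business rules
    definition  TEXT    -- machine-readable encoding (e.g. JSON or a DSL)
);
CREATE TABLE fact (
    fact_id     INTEGER PRIMARY KEY,
    payload     TEXT,   -- the captured transaction data
    rule_set_id INTEGER REFERENCES rule_set(rule_set_id)  -- capture context
);
""")
db.execute("INSERT INTO rule_set VALUES (1, 'order total = qty * unit price', '{}')")
db.execute("""INSERT INTO fact VALUES (1, '{"order": 42, "total": 99.5}', 1)""")

# Recall each fact together with the rules under which it was captured.
query = """SELECT f.payload, r.description
           FROM fact f JOIN rule_set r USING (rule_set_id)"""
for payload, rules in db.execute(query):
    print(payload, "| captured under:", rules)
```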
Abstract:
Customers will not continue to pay for a service that is perceived to be of poor quality and/or of no value. With a paradigm shift towards business dependence on service-orientated IS solutions [1], it is critical that alignment exists between service definition, delivery, and customer expectation if businesses are to ensure customer satisfaction. Services, and micro-service development, offer businesses a flexible structure for solution innovation; however, constant changes in technology and in business and societal expectations mean that an iterative analysis solution is required i) to determine whether provider services adequately meet customer segment needs and expectations, and ii) to help guide business service innovation and development. In this paper, by incorporating multiple models, we propose a series of steps to help identify and prioritise service gaps. Moreover, the authors propose the Dual Semiosis Analysis Model, a tool that highlights where, within the symbiotic customer/provider semiosis process, requirements misinterpretation and/or service provision deficiencies occur. This paper offers the reader a powerful customer-centric tool, designed to help business managers highlight both which services are critical to customer quality perception and where future innovation should be focused.
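As a hedged sketch of the gap-identification step (the paper's own scoring procedure is not reproduced here), a SERVQUAL-style weighted gap score per service attribute can be computed and ranked; attribute names, scores and weights are invented:

```python
# Sketch only: weighted expectation-vs-perception gaps per service
# attribute, ranked to prioritise innovation effort.
services = {
    # attribute: (customer expectation, perceived delivery, importance weight)
    "availability":    (9.0, 7.5, 0.9),
    "response_time":   (8.0, 8.2, 0.7),
    "personalisation": (7.0, 5.0, 0.5),
}

gaps = {name: (perceived - expected) * weight
        for name, (expected, perceived, weight) in services.items()}

# The most negative weighted gap is the highest-priority service gap.
for name, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{name:15s} weighted gap = {gap:+.2f}")
```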
Abstract:
Cities globally are in the midst of taking action to reduce greenhouse gas (GHG) emissions. After the vital step of emissions quantification, strategies must be developed that detail how emissions reduction targets will be achieved. The Pathways to Urban Reductions in Greenhouse Gas Emissions (PURGE) model allows the estimation of emissions from four pertinent urban sectors: electricity generation, buildings, private transportation, and waste. Additionally, the carbon storage from urban and regional forests is modeled. An emissions scenario is examined for a case study of the Greater Toronto Area (Ontario, Canada), using data on current technology stocks and government projections for stock change. The scenario presented suggests that even with some aggressive targets for technological adoption (especially in the transportation sector), it will be difficult to achieve the less ambitious 2050 emissions reduction goals of the Intergovernmental Panel on Climate Change. This is largely attributable to the long life of the building stock and the limitations of current retrofit practices. Additionally, demand reduction (through transportation mode shifting and building occupant behavior) will be an important component of future emissions cuts.
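A toy sketch of the stock-turnover accounting behind models of this kind (structure and numbers invented, not PURGE's): long-lived stocks such as buildings keep emitting at old intensities for decades, which is why building-stock inertia dominates the scenario's shortfall:

```python
# Sketch only: sector emissions as old technology stock is gradually
# replaced by cleaner stock at a sector-specific turnover rate.
def project(stock, intensity_old, intensity_new, turnover_rate, years):
    """Annual emissions while the old stock turns over to new technology."""
    old = stock
    emissions = []
    for _ in range(years):
        old -= old * turnover_rate  # fraction of old stock replaced this year
        emissions.append(old * intensity_old + (stock - old) * intensity_new)
    return emissions

# Buildings turn over slowly (~1%/yr); vehicles faster (~7%/yr).
buildings = project(1e6, 5.0, 1.0, 0.01, 40)
vehicles = project(2e6, 2.0, 0.2, 0.07, 40)
print(f"buildings, year 40: {buildings[-1]:.2e} (vs {1e6 * 5.0:.2e} today)")
print(f"vehicles,  year 40: {vehicles[-1]:.2e} (vs {2e6 * 2.0:.2e} today)")
```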
Abstract:
A manufactured aeration and nanofiltration MBR greywater system was tested during continuous operation at the University of Reading to demonstrate reliability in the delivery of high-quality treated greywater. Its treatment performance was evaluated against British Standard criteria [BSI (Greywater Systems—Part 1 Code of Practice: BS 8525-1:2010. BS Press, 2010); (Greywater Systems—Part 2 Domestic Greywater Treatment, Requirements and Methods: BS 8525-2:2011. BS Press, 2011)]. The low-carbon greywater recycling technology produced excellent analytical results as well as consistency in performance. User acceptance of such reliably treated greywater was then evaluated through user perception studies, the results of which inform the potential supply of treated greywater to student accommodation. Of 135 questionnaire replies, 95% demonstrated a lack of aversion, in one or more attributes, to using treated, recycled greywater.
Abstract:
European air quality legislation has reduced emissions of air pollutants across Europe since the 1970s, affecting air quality, human health and regional climate. We used a coupled composition-climate model to simulate the impacts of European air quality legislation and technology measures implemented between 1970 and 2010. We contrast simulations using two emission scenarios: one with actual emissions in 2010, and the other with the emissions that would have occurred in 2010 in the absence of technological improvements and end-of-pipe treatment measures in the energy, industrial and road transport sectors. European emissions of sulphur dioxide, black carbon (BC) and organic carbon in 2010 are 53%, 59% and 32% lower, respectively, than the emissions that would have occurred without legislative and technology measures. These emission reductions decreased simulated European annual mean concentrations of fine particulate matter (PM2.5) by 35%, sulphate by 44%, BC by 56% and particulate organic matter by 23%. The reduction in PM2.5 concentrations is calculated to have prevented 80 000 (37 000–116 000, at 95% confidence intervals) premature deaths annually across the European Union, resulting in a perceived financial benefit to society of US$232 billion annually (1.4% of 2010 EU GDP). The reduction in aerosol concentrations due to legislative and technology measures caused a positive change in the aerosol radiative effect at the top of the atmosphere, reduced atmospheric absorption, and increased the amount of solar radiation incident at the surface over Europe. We used an energy budget approximation to estimate that these changes in the radiative balance have increased European annual mean surface temperature by 0.45 ± 0.11 °C and precipitation by 13 ± 0.8 mm yr⁻¹. Our results show that the implementation of European legislation and technological improvements to reduce the emission of air pollutants has improved air quality and human health over Europe, while also having an unintended impact on the regional radiative balance and climate.
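As a hedged sketch of the energy-budget approximation mentioned above, the temperature response can be taken as proportional to the change in radiative forcing, dT = dF / lambda; the feedback parameter and forcing value below are illustrative assumptions, not the study's:

```python
# Sketch only: energy-budget estimate dT = dF / lambda. Both numbers are
# assumed, illustrative values, not those used in the study.
lambda_feedback = 1.3  # W m-2 K-1, assumed regional climate feedback
delta_F = 0.6          # W m-2, assumed change in the radiative balance

delta_T = delta_F / lambda_feedback
print(f"dF = {delta_F} W m-2  ->  dT = {delta_T:.2f} K")
```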
Abstract:
Stingless bee honey samples were evaluated by sensory descriptive analysis using the free-choice profile methodology. Appearance, flavor, and aroma were described, and the data were treated with Generalized Procrustes Analysis. The number of descriptive terms generated by individual assessors ranged from 8 to 20. Plotting the samples on a two-dimensional plane indicated that the appearance attributes (color and viscosity) and the sweet, sour and acid flavor attributes were strongly correlated with the x-axis (Dimension 1), while the coconut, wood, acid, sour and sweet flavor and aroma attributes were correlated with the y-axis (Dimension 2). An affective test was also performed and, with the exception of the Melipona scutellaris honey, all samples showed good acceptance. Honeys described as sweeter and less acid were preferred by the untrained assessors, indicating that the regular consumer recognizes honey produced by the Apis mellifera bee as the standard.
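A minimal sketch of Generalized Procrustes Analysis as applied to free-choice profiling data: each assessor's sample configuration is iteratively aligned (translation, scaling, rotation) to a running consensus. The data are synthetic, and for simplicity every assessor here uses the same two terms:

```python
# Sketch only: a simplified GPA loop built from pairwise Procrustes
# alignments; real free-choice profiles have differing terms per assessor.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(3)
true_map = rng.normal(0, 1, (6, 2))  # 6 honeys in a 2-D sensory space
assessors = [true_map @ rng.normal(0, 1, (2, 2)) + rng.normal(0, 0.1, (6, 2))
             for _ in range(5)]      # each assessor distorts the map

consensus = assessors[0]
for _ in range(10):                  # iterate to a stable consensus
    aligned = [procrustes(consensus, a)[1] for a in assessors]
    consensus = np.mean(aligned, axis=0)

print(consensus.round(2))  # Dimension 1 / Dimension 2 coordinates per sample
```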
Abstract:
A technique to calculate the current waveform for both close-up and remote short-circuit faults on DC-supplied railways and subways is presented. Exact DC short-circuit current calculation is best performed by sophisticated computer transient simulations; however, an accurate simplified method based on a second-order approximation, which can easily be executed with a calculator or a spreadsheet program, is proposed.
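A hedged illustration of the idea named in the abstract: for a plain series R-L fault loop (the paper's actual formulation may differ), the exact current rise i(t) = (V/R)(1 - e^(-t/tau)) can be truncated at the quadratic term, giving a second-order approximation that a spreadsheet can evaluate. Circuit values are invented:

```python
# Sketch only: exact R-L fault-current rise versus its second-order
# (quadratic) Taylor approximation, 1 - e^(-x) ~ x - x^2/2.
import math

V, R, L = 750.0, 0.05, 0.002  # V, ohm, H: illustrative traction-fault loop
tau = L / R

def i_exact(t):
    return (V / R) * (1 - math.exp(-t / tau))

def i_second_order(t):
    return (V / R) * (t / tau - 0.5 * (t / tau) ** 2)

for t in (0.005, 0.01, 0.02):  # seconds after fault inception
    print(f"t = {t * 1e3:4.0f} ms  exact = {i_exact(t):8.1f} A  "
          f"2nd-order = {i_second_order(t):8.1f} A")
```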