807 results for Technology Acceptance Model TAM
Abstract:
This paper reviews the literature on the use of Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to preserve the transactional context of the data captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems encode information into data that are then stored in databases without the business rules used in that encoding. This necessitates a practice whereby separate sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between information captured by OLTP systems and information recalled through OLAP systems. Literature on modelling business transaction information as facts with context was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the design quality of OLTP and OLAP systems depends critically on capturing facts with their associated context, encoding facts and context into data with business rules, storing and sourcing data together with those business rules, decoding data back into facts with context, and recalling facts with their associated context. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed model enables the implementation of multi-purpose databases and business-rules stores shared by OLTP and OLAP systems. Such implementations would allow OLTP systems to record data together with the executions of the business rules that produced them, so that both OLTP and OLAP systems can query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the transactional context of the data captured by the respective OLTP system.
Abstract:
Customers will not continue to pay for a service that is perceived to be of poor quality and/or of no value. With a paradigm shift towards business dependence on service-oriented IS solutions [1], alignment between service definition, delivery, and customer expectation is critical if businesses are to ensure customer satisfaction. Service and micro-service development offers businesses a flexible structure for solution innovation; however, constant change in technology and in business and societal expectations means that an iterative analysis solution is required to i) determine whether provider services adequately meet customer segment needs and expectations, and ii) help guide business service innovation and development. In this paper, by incorporating multiple models, we propose a series of steps to help identify and prioritise service gaps. Moreover, the authors propose the Dual Semiosis Analysis Model, a tool that highlights where, within the symbiotic customer/provider semiosis process, requirements misinterpretation and/or service provision deficiencies occur. This paper offers the reader a powerful customer-centric tool, designed to help business managers highlight both which services are critical to customer quality perception and where future innovation effort should be focused.
Abstract:
Cities globally are in the midst of taking action to reduce greenhouse gas (GHG) emissions. After the vital step of emissions quantification, strategies must be developed to detail how emissions reductions targets will be achieved. The Pathways to Urban Reductions in Greenhouse Gas Emissions (PURGE) model allows the estimation of emissions from four pertinent urban sectors: electricity generation, buildings, private transportation, and waste. Additionally, the carbon storage from urban and regional forests is modeled. An emissions scenario is examined for a case study of the greater Toronto, Ontario, Canada, area using data on current technology stocks and government projections for stock change. The scenario presented suggests that even with some aggressive targets for technological adoption (especially in the transportation sector), it will be difficult to achieve the less ambitious 2050 emissions reduction goals of the Intergovernmental Panel on Climate Change. This is largely attributable to the long life of the building stock and limitations of current retrofit practices. Additionally, demand reduction (through transportation mode shifting and building occupant behavior) will be an important component of future emissions cuts.
Abstract:
A manufactured aeration and nanofiltration MBR greywater system was tested during continuous operation at the University of Reading to demonstrate reliability in the delivery of high-quality treated greywater. Its treatment performance was evaluated against British Standard criteria [BSI (Greywater Systems—Part 1 Code of Practice: BS 8525-1:2010. BS Press, 2010); (Greywater Systems—Part 2 Domestic Greywater Treatment, Requirements and Methods: BS 8525-2:2011. BS Press, 2011)]. The low-carbon greywater recycling technology produced excellent analytical results as well as consistent performance. User acceptance of such reliably treated greywater was then evaluated through user perception studies, and the results inform the potential supply of treated greywater to student accommodation. Of 135 questionnaire replies, 95% demonstrated a lack of aversion, in one or more attributes, to using treated, recycled greywater.
Abstract:
European air quality legislation has reduced emissions of air pollutants across Europe since the 1970s, affecting air quality, human health and regional climate. We used a coupled composition-climate model to simulate the impacts of European air quality legislation and technology measures implemented between 1970 and 2010. We contrast simulations using two emission scenarios: one with actual emissions in 2010 and the other with the emissions that would have occurred in 2010 in the absence of technological improvements and end-of-pipe treatment measures in the energy, industrial and road transport sectors. European emissions of sulphur dioxide, black carbon (BC) and organic carbon in 2010 are 53%, 59% and 32% lower, respectively, than the emissions that would have occurred in 2010 in the absence of legislative and technology measures. These emission reductions decreased simulated European annual mean concentrations of fine particulate matter (PM2.5) by 35%, sulphate by 44%, BC by 56% and particulate organic matter by 23%. The reduction in PM2.5 concentrations is calculated to have prevented 80 000 (37 000–116 000, at 95% confidence intervals) premature deaths annually across the European Union, resulting in a perceived financial benefit to society of US$232 billion annually (1.4% of 2010 EU GDP). The reduction in aerosol concentrations due to legislative and technology measures caused a positive change in the aerosol radiative effect at the top of the atmosphere, reduced atmospheric absorption and increased the amount of solar radiation incident at the surface over Europe. We used an energy budget approximation to estimate that these changes in the radiative balance have increased European annual mean surface temperature by 0.45 ± 0.11 °C and precipitation by 13 ± 0.8 mm yr⁻¹. Our results show that the implementation of European legislation and technological improvements to reduce the emission of air pollutants has improved air quality and human health over Europe, while also having an unintended impact on the regional radiative balance and climate.
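Note on the energy-budget approximation above: the abstract does not spell out the formula, but estimates of this kind rest on the standard linear relation (symbols here are generic, not taken from the paper):

\Delta T \approx \lambda \, \Delta F

where \Delta F is the change in top-of-atmosphere radiative flux (W m^{-2}) and \lambda is a regional climate sensitivity parameter (K per W m^{-2}); an analogous budget argument yields the precipitation response.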
Abstract:
Stingless bee honey samples were evaluated by sensory descriptive analysis using the free-choice profile methodology. Appearance, flavor, and aroma were described, and the data were treated with Generalized Procrustes Analysis. The number of individual descriptive terms ranged from 8 to 20. Plotting the samples on a two-dimensional plane indicated that appearance attributes (color and viscosity) and sweet, sour, and acid flavors were strongly correlated with the x-axis (Dimension 1), while coconut, wood, acid, sour, and sweet flavor and aroma attributes were correlated with the y-axis (Dimension 2). An affective test was also performed and, with the exception of the Melipona scutellaris honey, all samples showed good acceptance. Honeys described as sweeter and less acid were preferred by non-trained assessors, indicating that the regular consumer recognizes honey produced by the Apis mellifera bee as the standard.
Abstract:
A technique to calculate the current waveform for both close-up and remote short-circuit faults on DC-supplied railways and subways is presented. Exact DC short-circuit current calculation is best performed by sophisticated computer transient simulations; however, an accurate simplified calculation method, based on a second-order approximation, that can be easily executed with a calculator or a spreadsheet program is proposed.
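As a minimal illustration of what a second-order approximation of a DC fault current can look like (the abstract does not give the authors' formula, so this sketch assumes a simple series RL fault loop and truncates the exponential rise at the quadratic term; the voltage, resistance and inductance values are hypothetical):

import math

def fault_current_exact(t, V, R, L):
    """Exact RL fault current rise: i(t) = (V/R) * (1 - exp(-t/tau))."""
    tau = L / R
    return (V / R) * (1.0 - math.exp(-t / tau))

def fault_current_second_order(t, V, R, L):
    """Second-order Taylor approximation of the same rise, simple enough
    for a spreadsheet: i(t) ~ (V/R) * (t/tau - t^2 / (2 tau^2))."""
    tau = L / R
    x = t / tau
    return (V / R) * (x - 0.5 * x * x)

# Hypothetical 750 V DC traction fault loop: R = 50 mOhm, L = 2 mH.
for t_ms in (0.5, 1.0, 2.0):
    t = t_ms / 1000.0
    print(t_ms, fault_current_exact(t, 750, 0.05, 0.002),
          fault_current_second_order(t, 750, 0.05, 0.002))

The quadratic form tracks the exact curve closely while t remains small relative to the time constant, which is the regime of interest for fault detection.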
Abstract:
This study determined the sensory shelf life of a commercial brand of chocolate and carrot cupcakes, aiming to extend the current 120-day shelf life to 180 days. Appearance, texture, flavor and overall quality of cakes stored for six different storage times were evaluated by 102 consumers. The data were analyzed by analysis of variance and linear regression. For both flavors, texture showed the greatest loss in acceptance during the storage period, with a mean acceptance close to indifference on the hedonic scale at 120 days. Nevertheless, appearance, flavor and overall quality remained acceptable up to 150 days. The end of shelf life was estimated at about 161 days for chocolate cakes and 150 days for carrot cakes. This study showed that the current 120-day shelf life can be extended to 150 days for carrot cake and to 160 days for chocolate cake. However, the 180-day shelf life desired by the company was not achieved. PRACTICAL APPLICATIONS: This research shows the adequacy of using sensory acceptance tests to determine the shelf life of two food products (chocolate and carrot cupcakes). This application is useful because precise determination of the shelf life of a food product is of vital importance to its commercial success. The maximum storage time should always be evaluated in the development or reformulation of new products and in changes to packaging or storage conditions. Once the physical-chemical and microbiological stability of a product is guaranteed, the sensory changes that could affect consumer acceptance will determine the end of the shelf life of a food product. Thus, the use of sensitive and reliable methods to estimate the sensory shelf life of a product is very important. The findings show the importance of determining the shelf life of each product separately and of avoiding applying the shelf life estimated for a specific product to other, similar products.
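As a hedged illustration of the regression step described above (the paper's actual scores are not given, so the hedonic scores and the acceptability cut-off of 6.0 below are made up), shelf life can be estimated as the storage time at which a fitted acceptance line crosses the cut-off:

import numpy as np

# Hypothetical mean acceptance scores (9-point hedonic scale)
# at six storage times (days); values are illustrative only.
days   = np.array([0, 30, 60, 90, 120, 150], dtype=float)
scores = np.array([7.8, 7.5, 7.1, 6.9, 6.4, 6.1])

# Fit acceptance = a * days + b by least squares.
a, b = np.polyfit(days, scores, 1)

# Shelf life = storage time at which predicted acceptance
# drops to an assumed acceptability cut-off (here 6.0).
cutoff = 6.0
shelf_life = (cutoff - b) / a
print(f"Estimated sensory shelf life: {shelf_life:.0f} days")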
Abstract:
Developed countries show an even distribution of published papers across the seventeen model organisms, whereas developing countries show biased preferences for the few model organisms associated with endemic human diseases. A variant of the Hirsch index, which we call the mean (mo)h-index ("model organism h-index"), shows an exponential relationship with the number of papers published in each country on the selected model organisms. Developing countries cluster together with low mean (mo)h-indexes, even those with a high number of publications. The growth curves of publications on the recent model Caenorhabditis elegans in developed countries show different shapes. We also analyzed the growth curves of indexed publications originating from developing countries; Brazil and South Korea were selected for this comparison. The most prevalent model organisms in those countries show different growth curves when compared to a global analysis, reflecting the size and composition of their research communities.
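The abstract does not define how the (mo)h-index is computed beyond being a Hirsch-index variant; for reference, a minimal sketch of the standard h-index computation such a variant would build on (the citation counts are made up):

def h_index(citations):
    """Standard Hirsch index: the largest h such that at least
    h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one country's papers on one model organism.
print(h_index([25, 17, 12, 8, 5, 4, 2, 1]))  # -> 5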
Abstract:
A time-efficient optical model is proposed for GATE simulation of a LYSO scintillation matrix coupled to a photomultiplier. The purpose is to avoid the excessively long computation time incurred when activating the optical processes in GATE. The usefulness of the model is demonstrated by comparing the simulated and experimental energy spectra obtained with the dual planar head equipment for dosimetry with a positron emission tomograph (DoPET). The procedure to apply the model is divided into two steps. First, a simplified simulation of a single crystal element of DoPET is used to fit an analytic function that models the optical attenuation inside the crystal. In a second step, the model is employed to calculate the influence of this attenuation on the energy registered by the tomograph. The proposed optical model is around three orders of magnitude faster than a GATE simulation with optical processes enabled. Good agreement was found between the experimental data and data simulated using the optical model. The results indicate that optical interactions inside the crystal elements play an important role in the energy resolution and induce a considerable degradation of the spectral information acquired by DoPET. Finally, the same approach employed by the proposed optical model could be useful for simulating a scintillation matrix coupled to a photomultiplier using a single or dual readout scheme.
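The abstract does not give the analytic form of the fitted attenuation function; a minimal sketch of the fitting step, assuming a simple exponential attenuation model and made-up depth/response data:

import numpy as np
from scipy.optimize import curve_fit

def attenuation(z, a, mu):
    """Assumed analytic model: light collection decays
    exponentially with interaction depth z in the crystal."""
    return a * np.exp(-mu * z)

# Hypothetical (depth in mm, relative light output) pairs from a
# simplified single-crystal simulation.
z = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.00, 0.82, 0.67, 0.55, 0.45, 0.37])

(a, mu), _ = curve_fit(attenuation, z, y, p0=(1.0, 0.1))
print(f"fitted amplitude a = {a:.3f}, attenuation coefficient mu = {mu:.3f} / mm")

Once fitted, such a function can be applied as a depth-dependent weight on deposited energies, sidestepping photon-by-photon optical tracking.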
Abstract:
We construct static soliton solutions with non-zero Hopf topological charges in a theory which is the extended Skyrme-Faddeev model with a further quartic term in derivatives. We use an axially symmetric ansatz based on toroidal coordinates, and solve the resulting two coupled nonlinear partial differential equations in two variables by a successive over-relaxation method. We construct numerical solutions with Hopf charge up to 4. The solutions present an interesting behavior under changes of a special combination of the coupling constants of the quartic terms.
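The coupled PDEs themselves are specific to the paper, but the successive over-relaxation scheme it mentions is standard; a minimal sketch of SOR applied to a generic 2-D Laplace problem (grid size, boundary values and relaxation factor are illustrative, not the paper's):

import numpy as np

def sor_laplace(u, omega=1.8, tol=1e-6, max_iter=10000):
    """Successive over-relaxation for Laplace's equation on a 2-D grid.
    Boundary values of u are held fixed; interior points are relaxed."""
    for _ in range(max_iter):
        max_delta = 0.0
        for i in range(1, u.shape[0] - 1):
            for j in range(1, u.shape[1] - 1):
                new = 0.25 * (u[i+1, j] + u[i-1, j] + u[i, j+1] + u[i, j-1])
                delta = omega * (new - u[i, j])
                u[i, j] += delta
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            break
    return u

# Hypothetical boundary condition: top edge held at 1, others at 0.
grid = np.zeros((20, 20))
grid[0, :] = 1.0
solution = sor_laplace(grid)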
Abstract:
This paper proposes a novel way to combine different observation models in a particle filter framework. This so-called auto-adjustable observation model enhances particle filter accuracy when the tracked objects overlap, without imposing a large runtime penalty on the whole tracking system. The approach has been tested in two important real-world situations related to animal behavior: mice and larvae tracking. The proposal was compared to some state-of-the-art approaches and the results show, on the datasets tested, that a good trade-off between accuracy and runtime can be achieved using an auto-adjustable observation model.
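The paper's auto-adjustable model combines different observation models inside the filter; a minimal sketch of the general idea, blending two Gaussian likelihoods with a weight alpha that could be adapted online (the dynamics, noise levels and alpha below are illustrative, not the authors' method):

import numpy as np

rng = np.random.default_rng(0)

def step(particles, weights, z, alpha):
    """One bootstrap particle-filter step with a blended observation model:
    likelihood = alpha * narrow Gaussian + (1 - alpha) * wide Gaussian.
    alpha would be auto-adjusted, e.g. lowered when targets overlap."""
    # Predict: random-walk motion model.
    particles = particles + rng.normal(0.0, 0.5, size=particles.shape)
    # Update: blend a precise and a tolerant observation model.
    lik_narrow = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    lik_wide = np.exp(-0.5 * ((z - particles) / 2.0) ** 2)
    weights = weights * (alpha * lik_narrow + (1.0 - alpha) * lik_wide)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

particles = rng.normal(0.0, 1.0, 200)
weights = np.full(200, 1.0 / 200)
for z in [0.2, 0.5, 0.9]:  # hypothetical 1-D measurements
    particles, weights = step(particles, weights, z, alpha=0.7)
print("state estimate:", np.sum(weights * particles))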
Abstract:
In this study, conducted at Landstinget Dalarna (the Dalarna County Council), the communication system Microsoft Lync was studied. Lync consists of the communication modules chat, video and telephone conferencing, and document sharing. The purpose of this report is to explain how training can affect people's acceptance of a communication system, and what the reasons may be for it not being used. To answer the study's research question, a quasi-experiment was used, carried out through two rounds of interviews with a training session between them. From the interview rounds, conclusions could be drawn about how the training had affected acceptance of the communication system Lync. The Unified Theory of Acceptance and Use of Technology (UTAUT) model was used to study the various factors that influence acceptance of technology. The conclusion addresses whether training affects people's acceptance and what the reasons may be that a communication system such as Lync is not used. After the training, some effect was observed for the chat module: attitudes had changed and the respondents saw increased usefulness after the training. The training had no effect where the respondents saw no usefulness.
Abstract:
The p-median model is used to locate P facilities to serve a geographically distributed population. Conventionally, it is assumed that the population always travels to the nearest facility. Drezner and Drezner (2006, 2007) provide three arguments on why this assumption might be incorrect, and they introduce the gravity p-median model to relax the assumption. We favour the gravity p-median model, but we note that in an applied setting, Drezner and Drezner's arguments are incomplete. In this communication, we point to the existence of a fourth compelling argument for the gravity p-median model.
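For contrast with the nearest-facility assumption, a minimal sketch of the gravity-style allocation underlying the gravity p-median model (a Huff-type rule; the decay parameter lam and the distances are illustrative):

import numpy as np

def gravity_allocation(distances, lam=0.5):
    """Probability that each demand point patronizes each open facility,
    decaying exponentially with distance (rows: demand points,
    columns: facilities)."""
    attraction = np.exp(-lam * distances)
    return attraction / attraction.sum(axis=1, keepdims=True)

# Hypothetical distances from 2 demand points to 3 candidate facilities.
d = np.array([[1.0, 2.0, 4.0],
              [3.0, 1.5, 2.5]])

print(gravity_allocation(d))  # gravity rule: demand spread over facilities
print(np.argmin(d, axis=1))   # conventional rule: nearest facility only

The gravity rule spreads each demand point's patronage across facilities instead of assigning it wholly to the nearest one, which is the assumption the communication argues against.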