883 results for Information theory in aesthetics
Abstract:
Chlamydia is a common sexually transmitted infection that has potentially serious consequences unless detected and treated early. The health service in the UK offers clinic-based testing for chlamydia but uptake is low. Identifying the predictors of testing behaviours may inform interventions to increase uptake. Self-tests for chlamydia may facilitate testing and treatment in people who avoid clinic-based testing. Self-testing and being tested by a health care professional (HCP) involve two contrasting contexts that may influence testing behaviour. However, little is known about how predictors of behaviour differ as a function of context. In this study, theoretical models of behaviour were used to assess factors that may predict intention to test in two different contexts: self-testing and being tested by an HCP. Individuals searching for or reading about chlamydia testing online were recruited using Google AdWords. Participants completed an online questionnaire that addressed previous testing behaviour and measured constructs of the Theory of Planned Behaviour and Protection Motivation Theory, which propose a total of eight possible predictors of intention. The questionnaire was completed by 310 participants. Sufficient data for multiple regression were provided by 102 and 118 respondents for self-testing and testing by an HCP respectively. Intention to self-test was predicted by vulnerability and self-efficacy, with a trend-level effect for response efficacy. Intention to be tested by an HCP was predicted by vulnerability, attitude and subjective norm. Thus, intentions to carry out two testing behaviours with very similar goals can have different predictors depending on test context. We conclude that interventions to increase self-testing should be based on evidence specifically related to test context.
Abstract:
One dominant feature of modern manufacturing chains is the movement of goods. Manufacturing companies would remain an unprofitable investment if the supply and logistics of raw materials, semi-finished products or final goods were not handled in an effective way. Both levels of a modern manufacturing chain, actual production and logistics, are characterized by continuous data creation at a much faster rate than the data can be meaningfully analyzed and acted upon manually. Often, instant and reliable decisions need to be taken based on huge, previously inconceivable amounts of heterogeneous, contradictory or incomplete data. The paper highlights aspects of information flows related to business process data visibility and observability in modern manufacturing networks. An information management platform developed in the framework of the EU FP7 project ADVANCE is presented.
Abstract:
Information Platforms have a major impact on the core activities and development of businesses. It is therefore necessary to present to students of economics (the future leaders of such entities) the problems related to Information Platforms, the risks they may pose, and exemplary approaches to solving some or all of these problems and minimizing the risks. This paper examines how the above problems are adapted when presenting them to students in economic majors. Students are presented with generalizations based on long-term observation of the emergence and development of Information Platforms in Bulgarian businesses in a growing market economy.
Abstract:
This is an extended version of an article presented at the Second International Conference on Software, Services and Semantic Technologies, Sofia, Bulgaria, 11–12 September 2010.
Abstract:
The purpose of this article is to evaluate the effectiveness of learning by doing as a practical tool for managing the training of students in "Library Management" at ULSIT, Sofia, Bulgaria, through the creation of the project 'Data Base "Bulgarian Revival Towns" (CD)', financed by the Bulgarian Ministry of Education, Youth and Science (1/D002/144/13.10.2011) and headed by Prof. DSc Ivanka Yankova, which aims to create a new information resource on these towns to serve the needs of scientific research. By participating in generating the array in the database through searching, selecting and digitizing documents from the period, students also get an opportunity to expand their skills in working effectively in a team and in finding interdisciplinary and causal connections between the studied items, objects and subjects, and, above all, to gain practical experience in digitization, information behaviour, information search strategies, etc. This method achieves good results in the accumulation of sustainable knowledge and generates motivation to work in the library and information professions.
Abstract:
Local Government Authorities (LGAs) are mainly characterised as information-intensive organisations. To satisfy their information requirements, effective information sharing within and among LGAs is necessary. Nevertheless, the dilemma of Inter-Organisational Information Sharing (IOIS) has been regarded as an inevitable issue for the public sector. Despite a decade of active research and practice, the field lacks a comprehensive framework to examine the factors influencing Electronic Information Sharing (EIS) among LGAs. The research presented in this paper contributes towards resolving this problem by developing a conceptual framework of factors influencing EIS in Government-to-Government (G2G) collaboration. By presenting this model, we attempt to clarify that EIS in LGAs is affected by a combination of environmental, organisational, business process, and technological factors and that it should not be scrutinised merely from a technical perspective. To validate the conceptual rationale, a multiple case study research strategy was selected. From an analysis of the empirical data from two case organisations, this paper exemplifies the importance (i.e. prioritisation) of these factors in influencing EIS by utilising the Analytical Hierarchy Process (AHP) technique. The intent herein is to offer LGA decision-makers a systematic decision-making process for ranking EIS influential factors from most important to least important. This systematic process will also assist LGA decision-makers in better interpreting EIS and its underlying problems. The research reported herein should be of interest to both academics and practitioners who are involved in IOIS in general, and collaborative e-Government in particular. © 2013 Elsevier Ltd. All rights reserved.
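The prioritisation step described above rests on AHP's pairwise-comparison matrices. As an illustration only (the factor groups and judgment values below are hypothetical, not taken from the case organisations), a minimal sketch of how AHP derives priority weights and a consistency ratio from such a matrix:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Derive AHP priority weights from a pairwise-comparison matrix via
    its principal eigenvector, plus the consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))          # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # normalised priority vector
    ci = (eigvals[k].real - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
    return w, ci / ri

# Hypothetical judgments: environmental vs organisational vs technological
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, cr = ahp_priorities(A)
```

A CR below 0.1 is conventionally taken to mean the judgments are consistent enough to trust the resulting ranking.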
Abstract:
2000 Mathematics Subject Classification: 41A25, 41A36.
Abstract:
The article concerns the "exit, voice, and loyalty" concept, which straddles the border of political science and economics. This theoretical framework, developed by Albert O. Hirschman, has exercised a fruitful influence on the social sciences in the last few decades through its fresh features and original approach. However, it holds a peripheral position in economics and plays an undervalued role in economic education in Hungary, even though it can be flexibly applied in analyses of such phenomena as the economic transition and the ensuing social tensions. Moreover, the very turbulence of the conditions brought about by the crisis shows that the political economy of decline offers a relevant perspective, so that the Hirschman trilemma is a practical analytical tool for understanding social and economic processes.
Abstract:
The search-experience-credence framework from the economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure) and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources).

This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk.

This research is the first in-depth study of the role and effects of information in service physical environments. Using a 2 x 2 between-subjects experimental procedure, undergraduate students were exposed to the informational signs in a simulated service physical environment. The service physical environments were simulated through color photographic slides.

The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.
Abstract:
If we classify the variables in a program into various security levels, then a secure information flow analysis aims to verify statically that information in the program can flow only in ways consistent with the specified security levels. One well-studied approach is to formulate the rules of the secure information flow analysis as a type system. A major trend of recent research focuses on how to accommodate various sophisticated modern language features. However, this approach often leads to overly complicated and restrictive type systems, making them unfit for practical use. Also, problems essential to practical use, such as type inference and error reporting, have received little attention. This dissertation identified and solved major theoretical and practical hurdles to the application of secure information flow.

We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started out with a small, simple imperative language and only added features that we deemed most important for practical use. One language feature we addressed is arrays. Due to the various leaking channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach for lenient array operations, which leads to simple and lenient typing of arrays.

Type inference is necessary because a user is usually concerned only with the security types of the input/output variables of a program and would like all types for auxiliary variables to be inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system and therefore facilitates informative error reporting, generated in a cascading fashion. Algorithm B and error reporting have been implemented and tested.

Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally based on feedback from the type checking/inference. Core computations only interact with code modules from involved parties through well-defined interfaces. All code modules are digitally signed to ensure their authenticity and integrity.
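The type-based view of secure information flow described above can be illustrated with the classic two-level lattice rule (a textbook Denning-style check, not the dissertation's algorithm B): an assignment is accepted only if the join of the levels of the variables it reads, together with the program-counter label, flows to the level of the assigned variable.

```python
LOW, HIGH = 0, 1   # two-point security lattice: LOW flows to HIGH

def check_assign(target, read_vars, env, pc=LOW):
    """Typing rule for `target := e`: the join (max) of the levels of the
    variables read in e and the program-counter label pc must flow to
    (be <=) the level of the target variable."""
    rhs_level = max([env[v] for v in read_vars] + [pc])
    return rhs_level <= env[target]

env = {"secret": HIGH, "public": LOW, "sink": HIGH}
ok1 = check_assign("sink", {"secret"}, env)        # HIGH -> HIGH: accepted
ok2 = check_assign("public", {"secret"}, env)      # explicit flow: rejected
ok3 = check_assign("public", set(), env, pc=HIGH)  # implicit flow under a
                                                   # HIGH guard: rejected
```

The `pc` parameter is what rejects implicit flows: an assignment to a LOW variable inside a branch conditioned on HIGH data leaks the guard, even if the right-hand side is constant.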
Abstract:
An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. This scheme is derived from the conventional kernel estimator based prediction model by associating the real-time nonlinear impacts caused by neighboring arcs' traffic patterns with the historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville City. Experimental results illustrate that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This novel method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance. A direction filter is developed to remove joints that have a negative influence on the localization accuracy. Synthetic experiments in urban, suburban and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining the cellular probe's position. The results show that the cellular probe's localization accuracy can be notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation algorithm based floating car data (FCD) technique. The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson Product-Moment Correlation Coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing the computational cost without affecting the accuracy of the matching process.
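To make the FNCC-for-PMCC substitution concrete: the speed-up comes from computing the numerator of the normalized cross-correlation with a single FFT-based correlation, and the window statistics with running sums, instead of a full Pearson computation per offset. The sketch below is an illustration of the standard FNCC idea on synthetic 1-D curves, not the dissertation's implementation, and checks that both routes agree:

```python
import numpy as np

def ncc_direct(signal, template):
    """Sliding Pearson correlation (PMCC) of `template` against every
    window of `signal` -- O(n*m), one full pass per offset."""
    n, m = len(signal), len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(n - m + 1)
    for k in range(n - m + 1):
        w = signal[k:k + m]
        out[k] = np.dot((w - w.mean()) / w.std(), t) / m
    return out

def ncc_fft(signal, template):
    """Fast Normalized Cross-Correlation: numerator via one FFT-based
    correlation, window means/variances via cumulative sums."""
    n, m = len(signal), len(template)
    t = template - template.mean()            # zero-mean template
    size = int(2 ** np.ceil(np.log2(n + m)))  # zero-pad to a power of two
    num = np.fft.irfft(np.fft.rfft(signal, size) *
                       np.conj(np.fft.rfft(t, size)), size)[:n - m + 1]
    c1 = np.cumsum(np.insert(signal, 0, 0.0))          # running sums
    c2 = np.cumsum(np.insert(signal * signal, 0, 0.0))
    wsum = c1[m:] - c1[:-m]
    wssq = (c2[m:] - c2[:-m]) - wsum * wsum / m        # sum of squared devs
    return num / np.sqrt(wssq * (t * t).sum())

rng = np.random.default_rng(0)
s = rng.standard_normal(500)          # a synthetic travel-time curve
t = s[200:232].copy()                 # the sub-curve to match
direct = ncc_direct(s, t)
fast = ncc_fft(s, t)                  # same values, computed in O(n log n)
```

Both return the Pearson coefficient at every offset, so the best match (here the offset the template was cut from, where the coefficient is exactly 1) is found identically; only the cost per query changes.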
Abstract:
This paper analyzes how José Lopéz’s participatory action research and transformational learning theory addresses the oppressed Puerto Rican experience. The paper examines the historical experience of colonialism, explains these two theories, and explores Lopéz’s adult education work in the Puerto Rican community using participatory action research and transformational learning.
Abstract:
Outsourcing of informational services, a growing trend outside the hospitality industry for several years, is the process of contracting with an outside vendor to take over all or part of a company's information processing needs. The author examines the pros and cons of outsourcing to help the hospitality industry determine if this is a business practice to be considered.
Abstract:
Postprint
Abstract:
Purpose: The purpose of the research described in this paper is to disentangle the rhetoric from the reality in relation to supply chain management (SCM) adoption in practice. There is significant evidence of a divergence between theory and practice in the field of SCM. Research Approach: The authors' review of the extant SCM literature highlighted a lack of replication studies in SCM, leading to the concept of refined replication being developed. The authors conducted a refined replication of the work of Sweeney et al. (2015), in which a new SCM definitional construct, the Four Fundamentals, was proposed. The work presented in this article refines the previous study but adopts the same three-phase approach: focussed interviews, a questionnaire survey, and focus groups. This article covers the second phase of the refined replication study and describes an integrated research design for a questionnaire survey to be undertaken in Britain. Findings and Originality: The article presents an integrated research design for a questionnaire survey, with emphasis on the refined replication of the previous work of Sweeney et al. (2015) carried out in Ireland and its adaptation to the British context. Research Impact: The authors introduce the concept of refined replication in SCM research. This allows previous research to be built upon in order to test understanding of SCM theory and its practical implementation, based on the Four Fundamentals construct, among SCM professionals in Britain. Practical Impact: The article presents an integrated research design for a questionnaire survey that may be used in similar studies.