816 results for 080607 Information Engineering and Theory
Abstract:
In order to run a successful business, today’s manager needs to combine business skills with an understanding of information systems and the opportunities and benefits that they bring to an organisation. Starting from basic concepts, this book provides a comprehensive and accessible guide to:
• understanding the technology of business information systems;
• choosing the right information system for an organisation;
• developing and managing an efficient business information system;
• employing information systems strategically to achieve organisational goals.
Taking a problem-solving approach, Business Information Systems looks at information systems theory within the context of the most recent business and technological advances. This thoroughly revised new edition has updated and expanded coverage of contemporary key topics such as:
• Web 2.0
• enterprise systems
• implementation and design of IS strategy
• outsourcing
Abstract:
Using the wisdom of crowds---combining many individual forecasts to obtain an aggregate estimate---can be an effective technique for improving forecast accuracy. When individual forecasts are drawn from independent and identical information sources, a simple average provides the optimal crowd forecast. However, correlated forecast errors greatly limit the ability of the wisdom of crowds to recover the truth. In practice, this dependence often emerges because information is shared: forecasters may to a large extent draw on the same data when formulating their responses.
To address this problem, I propose an elicitation procedure in which each respondent is asked to provide both their own best forecast and a guess of the average forecast that will be given by all other respondents. I study optimal responses in a stylized information setting and develop an aggregation method, called pivoting, which separates individual forecasts into shared and private information and then recombines these results in the optimal manner. I develop a tailored pivoting procedure for each of three information models, and introduce a simple and robust variant that outperforms the simple average across a variety of settings.
In three experiments, I investigate the method and the accuracy of the crowd forecasts. In the first study, I vary the shared and private information in a controlled environment, while the latter two studies examine forecasts in real-world contexts. Overall, the data suggest that a simple minimal pivoting procedure provides an effective aggregation technique that can significantly outperform the crowd average.
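The aggregation logic described above can be sketched in a few lines. The specific weighting below (doubling the crowd mean and subtracting the mean meta-prediction) is the simplest "minimal pivoting" variant consistent with the description; it is an illustrative assumption, not necessarily the exact estimator developed in the dissertation.

```python
def minimal_pivot(own_forecasts, meta_predictions):
    """Crowd estimate that pivots away from shared information.

    own_forecasts    -- each respondent's own best forecast
    meta_predictions -- each respondent's guess of the average forecast
                        that the other respondents will give
    The mean meta-prediction proxies for the shared signal, so moving
    the plain crowd mean away from it restores weight to the private
    information that simple averaging dilutes.
    """
    n = len(own_forecasts)
    mean_own = sum(own_forecasts) / n
    mean_meta = sum(meta_predictions) / n
    # Pivot: push the crowd mean away from the shared baseline.
    return 2 * mean_own - mean_meta
```

For example, if respondents expect the others to answer 11 on average while their own forecasts average 12, the pivoted estimate moves past the crowd mean to 13, crediting the private information that pulled the individual forecasts above the shared baseline.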
Abstract:
One of the leading motivations behind the multilingual semantic web is to make resources digitally accessible in an online, global, multilingual context. Consequently, it is essential for knowledge bases to manage multilingualism and thus to be equipped with procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More specifically, multilingualism and conceptual modelling are dealt with from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and the conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment to capturing semantic knowledge (Ontology), procedural knowledge (Cognicon) and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.
Abstract:
In Marxist frameworks “distributive justice” depends on extracting value through a centralized state. Many new social movements—peer to peer economy, maker activism, community agriculture, queer ecology, etc.—take the opposite approach, keeping value in its unalienated form and allowing it to freely circulate from the bottom up. Unlike Marxism, there is no general theory for bottom-up, unalienated value circulation. This paper examines the concept of “generative justice” through an historical contrast between Marx’s writings and the indigenous cultures that he drew upon. Marx erroneously concluded that while indigenous cultures had unalienated forms of production, only centralized value extraction could allow the productivity needed for a high quality of life. To the contrary, indigenous cultures now provide a robust model for the “gift economy” that underpins open source technological production, agroecology, and restorative approaches to civil rights. Expanding Marx’s concept of unalienated labor value to include unalienated ecological (nonhuman) value, as well as the domain of freedom in speech, sexual orientation, spirituality and other forms of “expressive” value, we arrive at an historically informed perspective for generative justice.
Abstract:
Current Ambient Intelligence and Intelligent Environment research focuses on interpreting a subject’s behaviour at the activity level by logging Activities of Daily Living (ADL) such as eating and cooking. In general, the sensors employed (e.g. PIR sensors, contact sensors) provide low-resolution information. Meanwhile, the expansion of ubiquitous computing allows researchers to gather additional information from different types of sensor, which makes it possible to improve activity analysis. Building on previous research into sitting posture detection, this research further analyses human sitting activity. The aim is to use a non-intrusive, low-cost, pressure-sensor-embedded chair system to recognize a subject’s activity from their detected postures. The research proceeds in three steps: the first is to find a hardware solution for low-cost sitting posture detection, the second is to find a suitable strategy for sitting posture detection, and the last is to correlate time-ordered sitting posture sequences with sitting activity. The author developed a prototype sensing system called IntelliChair for sitting posture detection. Two experiments were conducted to determine the hardware architecture of the IntelliChair system. These experiments compare candidate sensors and their integration, and identify the best option for a low-cost, non-intrusive system. Subsequently, the research applies signal processing theory to explore the frequency characteristics of sitting posture, in order to determine a suitable sampling rate for the IntelliChair system. For the second and third steps, ten subjects were recruited for sitting posture and sitting activity data collection. The former dataset was collected by asking subjects to perform certain pre-defined sitting postures on IntelliChair, and is used for the posture recognition experiment.
The latter dataset was collected by asking the subjects to follow their normal sitting activity routine on IntelliChair for four hours, and is used for the activity modelling and recognition experiment. For the posture recognition experiment, two Support Vector Machine (SVM) based classifiers are trained (one for spine postures and the other for leg postures), and their performance is evaluated. A Hidden Markov Model is used for sitting activity modelling and recognition, in order to infer the selected sitting activities from sitting posture sequences. After experimenting with candidate sensors, the Force Sensing Resistor (FSR) is selected as the pressure sensing unit for IntelliChair. Eight FSRs are mounted on the seat and back of a chair to gather haptic (i.e., touch-based) posture information. Furthermore, the research explores the possibility of using an alternative non-intrusive sensing technology (the vision-based Kinect sensor from Microsoft) and finds that the Kinect sensor is not reliable for sitting posture detection due to joint drift. Based on the experimental results, a sampling rate of 6 Hz is determined to be suitable for IntelliChair. The posture classification results show that the SVM-based classifiers are robust to “familiar” subject data (accuracy is 99.8% for spine postures and 99.9% for leg postures). When dealing with “unfamiliar” subject data, the accuracy is 80.7% for spine posture classification and 42.3% for leg posture classification. Activity recognition achieves 41.27% accuracy across the four selected activities (relaxing, playing a game, working with a PC and watching video). These results show that differences in individual body characteristics and sitting habits influence both sitting posture and sitting activity recognition, suggesting that IntelliChair is suitable for individual use provided a per-user training stage is carried out.
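The posture-recognition pipeline (an eight-value FSR pressure vector in, a posture label out) can be illustrated with a minimal stand-in classifier. The thesis trains SVM classifiers; the nearest-centroid rule below is used here only because it fits in a few self-contained lines, and the labels and readings are invented examples.

```python
def train_centroids(samples):
    """samples: {posture_label: list of 8-value FSR reading vectors}.
    Returns the mean pressure vector (centroid) for each posture."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(v[i] for v in vecs) / n for i in range(8)]
    return centroids

def classify(centroids, reading):
    """Assign a new 8-value FSR reading to the nearest posture centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], reading))
```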
Abstract:
Drawing on historical research, personal interviews, performance analysis, and my own embodied experience as a participant-observer in several clown workshops, I explore the diverse historical influences on clown theatre as it is conceived today. I then investigate how the concept of embodied knowledge is reflected in red-nose clown pedagogy. Finally, I argue that through shared embodied knowledge spectators are able to perceive and appreciate the humor of clown theatre in performance. I propose that clown theatre represents a reaction to the eroding personal connections prompted by the so-called information age, and that humor in clown theatre is a revealing index of socio-cultural values, attitudes, dispositions, and concerns.
Abstract:
Peer-to-peer information sharing has fundamentally changed the customer decision-making process. Recent developments in information technologies have enabled digital sharing platforms to influence various granular aspects of the information sharing process. Despite the growing importance of digital information sharing, little research has examined the optimal design choices for a platform seeking to maximize returns from information sharing. My dissertation seeks to fill this gap. Specifically, I study novel interventions that can be implemented by the platform at different stages of information sharing. In collaboration with a leading for-profit platform and a non-profit platform, I conduct three large-scale field experiments to causally identify the impact of these interventions on customers’ sharing behaviors as well as the sharing outcomes. The first essay examines whether and how a firm can enhance social contagion by simply varying the message shared by customers with their friends. Using a large randomized field experiment, I find that i) adding only information about the sender’s purchase status increases the likelihood of recipients’ purchase; ii) adding only information about the referral reward increases recipients’ follow-up referrals; and iii) adding information about both the sender’s purchase and the referral reward increases neither the likelihood of purchase nor follow-up referrals. I then discuss the underlying mechanisms. The second essay studies whether and how a firm can design unconditional incentives to engage customers who have already revealed a willingness to share. I conduct a field experiment to examine the impact of incentive design on the sender’s purchase as well as further referral behavior. I find evidence that incentive structure has a significant, but interestingly opposing, impact on the two outcomes. The results also provide insights into senders’ motives for sharing.
The third essay examines whether and how a non-profit platform can use mobile messaging to leverage recipients’ social ties to encourage blood donation. I design a large field experiment to causally identify the impact of different types of information and incentives on donors’ self-donation and group donation behavior. My results show that non-profits can stimulate a group effect and increase blood donation, but only with a group reward, which works by motivating a different donor population. In summary, the findings from the three studies offer valuable insights for platforms and social enterprises on how to engineer digital platforms to create social contagion. The rich data from randomized experiments and complementary sources (archival and survey) also allow me to test the underlying mechanisms at work. In this way, my dissertation provides both managerial implications and theoretical contributions to the study of peer-to-peer information sharing.
Abstract:
The large amount of information in electronic contracts hampers their establishment due to high complexity. An approach inspired by Software Product Lines (PL) and based on feature modelling was proposed to make this process more systematic through information reuse and structuring. By assessing the feature-based approach against a proposed set of requirements, it was shown that the approach does not allow the price of services and of Quality of Service (QoS) attributes to be considered in the negotiation and included in the electronic contract. Thus, this paper also presents an extension of that approach in which prices and price types associated with Web services and QoS levels are applied. An extended toolkit prototype is also presented, as well as an example experiment with the proposed approach.
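The pricing extension can be pictured with a small sketch of how a negotiable feature might carry a price and a price type. All names here (ServiceFeature, the "flat"/"per_call" price types) are hypothetical illustrations, not the toolkit's actual model.

```python
from dataclasses import dataclass

@dataclass
class ServiceFeature:
    """One negotiable Web-service feature in a contract feature model.
    Field names and price types are illustrative assumptions."""
    name: str
    qos_level: str
    price: float
    price_type: str  # "flat" (one-off fee) or "per_call" (usage-based)

def contract_total(features, calls=0):
    """Price an electronic contract from its selected features."""
    total = 0.0
    for f in features:
        # A flat fee is charged once; a per-call price scales with usage.
        total += f.price if f.price_type == "flat" else f.price * calls
    return total
```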
Abstract:
Purpose - The purpose of this paper is to examine whether the level of logistics information systems (LIS) adoption in manufacturing companies is influenced by organizational profile variables, such as the company's size, the nature of its operations and its subsector. Design/methodology/approach - A review of the mainstream literature on LIS was carried out to identify the factors influencing the adoption of such information systems and also some research gaps. The empirical study's strategy is based on survey research in Brazilian manufacturing firms from the capital goods industry. Data collected were analyzed through Kruskal-Wallis and Mann-Whitney non-parametric tests. Findings - The analysis indicates that characteristics such as the size of companies and the nature of their operations influence the levels of LIS adoption, whilst the subsector appeared to have little influence. Originality/value - This is the first known study to examine the influence of organizational profile variables such as size, nature of operations and subsector on the level of LIS adoption in manufacturing companies. Moreover, it is unique in portraying the Brazilian scenario on this topic and in addressing the adoption of seven types of LIS in a single study.
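The Mann-Whitney test applied in the analysis reduces to a pair-counting statistic that is easy to state directly. The sketch below computes the U statistic only; the paper's analyses would additionally need the null distribution (or a statistical package) to obtain p-values.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x against sample y (pair-counting form).

    Counts, over all (xi, yj) pairs, how often xi exceeds yj,
    crediting ties with 1/2. A U far from its null mean of
    len(x) * len(y) / 2 signals a difference between the groups.
    """
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u
```

By symmetry the two directions always satisfy U(x, y) + U(y, x) = len(x) * len(y).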
Abstract:
A large percentage of pile caps support only one column, and the pile caps in turn are supported by only a few piles. These are typically short and deep members with overall span-depth ratios of less than 1.5. Codes of practice do not provide uniform treatment for the design of these types of pile caps. These members have traditionally been designed as beams spanning between piles, with the depth selected to avoid shear failures and the amount of longitudinal reinforcement selected to provide sufficient flexural capacity as calculated by engineering beam theory. More recently, the strut-and-tie method has been used for the design of pile caps (a disturbed, or D-, region), in which the load path is envisaged as a three-dimensional truss, with compressive forces carried by concrete struts between the column and the piles and tensile forces carried by reinforcing steel located between piles. Neither of these models has provided uniform factors of safety against failure, nor can they predict whether failure will occur by flexure (a ductile mode) or shear (a brittle mode). In this paper, an analytical model based on the strut-and-tie approach is presented. The proposed model has been calibrated using an extensive experimental database of pile caps subjected to compression and evaluated analytically for more complex loading conditions. It has proven applicable across a broad range of test data and can predict the failure modes, cracking, yielding, and failure loads of four-pile caps with reasonable accuracy.
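The truss idealisation described above reduces, for a symmetric four-pile cap under pure compression, to elementary statics. The sketch below is only that statics for a hypothetical symmetric geometry, not the paper's calibrated model, which additionally predicts cracking, yielding, and failure modes.

```python
import math

def four_pile_strut_tie(P, d, a):
    """Strut and tie forces for a symmetric four-pile cap (statics only).

    P -- column load, shared equally so each pile reacts P/4
    d -- vertical lever arm from the tie to the node under the column
    a -- horizontal distance from the column centre to each pile
    The inclined strut running to each pile balances the P/4 reaction;
    the tie reinforcement carries the strut's horizontal component.
    """
    theta = math.atan2(d, a)                  # strut inclination
    strut_force = (P / 4) / math.sin(theta)   # compression in each strut
    tie_force = (P / 4) / math.tan(theta)     # horizontal tie demand
    return strut_force, tie_force
```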
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived from the integral conservation of mechanical energy. The weak formulation of the thermal energy equation is modelled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second law of thermodynamics.
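The "pressure as a constraint function" idea above corresponds to the standard weak statement of incompressibility (textbook form, not transcribed from the paper):

```latex
% Incompressibility: pointwise and weak (Galerkin) statements
\nabla \cdot \mathbf{u} = 0
\qquad\Longleftrightarrow\qquad
\int_{\Omega} q \,(\nabla \cdot \mathbf{u})\, d\Omega = 0
\quad \text{for all admissible test functions } q
```

In this reading the pressure supplies the Lagrange multiplier that enforces the constraint, taking whatever values are needed to keep the discrete velocity field divergence-free.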
Abstract:
The theory of Owicki and Gries has been used as a platform for safety-based verification and derivation of concurrent programs. It has also been integrated with the progress logic of UNITY, which has allowed newer techniques of progress-based verification and derivation to be developed. However, a theoretical basis for the integrated theory has thus far been missing. In this paper, we provide a theoretical background for the logic of Owicki and Gries integrated with the logic of progress from UNITY. An operational semantics for the new framework is provided and used to prove soundness of the progress logic.
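The safety side of the Owicki-Gries theory mentioned above rests on the interference-freedom side condition, which in its standard textbook form requires every assertion in one thread's proof outline to be preserved by every atomic statement of the other threads:

```latex
% Interference freedom (standard Owicki-Gries side condition):
% for each assertion A in one thread's proof outline and each
% atomic statement S, with precondition pre(S), in another thread,
\{\, A \wedge \mathit{pre}(S) \,\}\; S \;\{\, A \,\}
```

It is this quadratic family of extra proof obligations that the paper's operational semantics must validate when proving soundness.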
Abstract:
Power system real-time security assessment is one of the fundamental modules of electricity markets. Typically, when a contingency occurs, the security assessment and enhancement module is required to be ready for action within about 20 minutes to meet the real-time requirement. The recent California blackout again highlighted the importance of system security. This paper proposes an approach to power system security assessment and enhancement based on information provided by a pre-defined system parameter space. The proposed scheme opens up an efficient way to perform real-time security assessment and enhancement in a competitive electricity market for the single-contingency case.
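One way to picture "assessment from a pre-defined parameter space" is an offline/online split: characterise the secure operating region offline, then reduce the real-time check to a cheap membership test. The box-shaped region below is a deliberately crude stand-in for whatever characterisation the paper actually uses.

```python
def build_secure_region(secure_points):
    """Offline step: bound known-secure operating points component-wise.
    A box is a toy stand-in for the paper's pre-defined parameter space."""
    dims = range(len(secure_points[0]))
    lows = [min(p[i] for p in secure_points) for i in dims]
    highs = [max(p[i] for p in secure_points) for i in dims]
    return lows, highs

def is_secure(region, operating_point):
    """Online step: the real-time check is a constant-time box test."""
    lows, highs = region
    return all(lo <= v <= hi
               for lo, v, hi in zip(lows, operating_point, highs))
```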