980 results for Wage payment systems
Abstract:
Payment systems all over the world have grown into a complicated web of solutions. This is even more challenging in the case of mobile-based payment systems, which are numerous and rely on different technologies to provide different services. The diffusion of these technologies in a market is uncertain. Diffusion theorists such as Rogers and Davis suggest how innovations are accepted in markets. In the case of electronic payment systems, the tale of Mondex vs Octopus offers interesting insights into diffusion. Our paper attempts to understand the success potential of various mobile payment technologies. We illustrate what we describe as technology breadth in mobile payment systems using data from payment systems all over the world (n=62). Our data show an unexpected superiority of SMS technology over other technologies such as NFC and WAP. We also used a Delphi-based survey (n=5) with experts to address the possibility that SMS will gain superiority in market diffusion. The economic conditions of a country (particularly in developing countries), the services availed, and the characteristics of users (for example, the number of unbanked users in heavily populated countries) may put SMS at the forefront. This may be especially true for micropayments made on mobile devices.
Abstract:
We make a comparative study of payment systems in the fifteen E.U. countries for the 1996-2002 period. Special attention is paid to the introduction of the new European single currency. The overall trend in payments is a move from cash to noncash payment instruments, although electronic instruments are not yet widely used. We find that the introduction of the new banknotes and coins had a significant impact on card use.
Abstract:
Digital Rights Management Systems (DRMS) are seen by content providers as the appropriate tool to fight piracy on the one hand and to monetize their assets on the other. Although these systems claim to be very powerful and include multiple protection technologies, there is a lack of understanding about how such systems are currently being implemented and used by content providers. The aim of this paper is twofold. First, it provides a theoretical basis through which we briefly present the seven core protection technologies of a DRMS. Second, it provides empirical evidence that these seven protection technologies are the most commonly used, and it evaluates to what extent they are being used within the music and print industries. It concludes that the three main technologies are encryption, passwords, and payment systems. However, the industries differ in the number of protection technologies used, the requirements for a DRMS, the required investment, and the perceived success of DRMS in fighting piracy.
Abstract:
For e-commerce to be successful, an efficient e-payment system is key. The use of formal payment systems also enhances the ability to execute and manage monetary policy, which is essential for a country's financial sector. Although Nigeria is a regional leader, usage of e-payments is still very low despite their many benefits and attempts by the financial authorities to promote them. There is therefore an urgent need to investigate the factors that affect individuals' intention to adopt e-payments in Nigeria so that steps can be taken to improve the situation. A survey was conducted to obtain individual perceptions of e-payments through questionnaires designed on the basis of the theoretical model developed. The results showed that awareness, knowledge of benefits, ease of use, reliability, trust and security, acceptability, accessibility and social influence are the main factors that influence individuals' intention to adopt e-payments.
Abstract:
Our daily lives become more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats for these devices become increasingly dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level. Therefore, third-party developers have the opportunity to develop kernel-based low-level security tools, which is not usual for smartphone platforms. Android quickly gained popularity among smartphone developers and beyond, since it is based on Java on top of "open" Linux, in contrast to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, for example, holding the greatest market share among all smartphone OSs, was closing critical APIs to common developers and introduced application certification because it was the main target for smartphone malware in the past. In fact, more than 290 malware samples designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones with a focus on its Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check security functionalities. We survey well-accepted security mechanisms and tools which can increase device security, provide descriptions of how to adopt these security tools in the Android kernel, and provide an overhead analysis in terms of resource usage. As open smartphone platforms are released and may increase their market share as Symbian did, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature and intrusion detection methods in the Android environment. We focus on monitoring events in the kernel; that is, identifying critical kernel, log file, file system and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution involves initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android and performed a statistical analysis of the function calls used by these applications. The results of the analysis can be compared against newly installed applications to detect significant differences, and certain function calls directly indicate malicious activity. Therefore, we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results present a first step towards detecting malicious applications on Android-based devices.
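A minimal sketch of the kind of static function-call profiling described in the third contribution, assuming a helper that lists a binary's imported symbols via `nm` and a purely illustrative set of suspicious calls; this is not the authors' tool or their actual decision tree.

```python
import subprocess
from collections import Counter

# Purely illustrative "suspicious" imports; not indicators taken from the paper.
SUSPICIOUS_CALLS = {"execve", "fork", "ptrace", "connect"}

def extract_called_symbols(path):
    """List the undefined (imported) symbols of an ELF binary via `nm -D`."""
    out = subprocess.run(["nm", "-D", "--undefined-only", path],
                         capture_output=True, text=True, check=True).stdout
    return {line.split()[-1] for line in out.splitlines() if line.strip()}

def build_baseline(known_good_paths):
    """Count how often each library call appears across known-good executables."""
    counts = Counter()
    for p in known_good_paths:
        counts.update(extract_called_symbols(p))
    return counts

def suspiciousness(path, baseline, n_good):
    """Tiny decision tree: blacklisted calls first, then rarity of the calls used."""
    calls = extract_called_symbols(path)
    if calls & SUSPICIOUS_CALLS:
        return "suspicious"
    rare = [c for c in calls if baseline.get(c, 0) / max(n_good, 1) < 0.05]
    return "needs review" if len(rare) > 5 else "likely benign"
```

A baseline built once from the pre-installed executables could then be compared against each newly installed application, flagging binaries whose call profile deviates strongly.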
Abstract:
Our daily lives become more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, e.g. for payment systems or assisting the lives of elderly or disabled people. Security threats for these devices become more and more dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level and where, for the first time, third-party developers have the opportunity to develop kernel-based low-level security tools. Android quickly gained popularity among smartphone developers and beyond, since it is based on Java on top of "open" Linux, in contrast to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, holding the greatest market share among all smartphone OSs, was even closing critical APIs to common developers and introduced application certification because it was the main target for smartphone malware in the past. In fact, more than 290 malware samples designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. Since signature-based approaches mainly detect known malware, anomaly-based approaches can be a valuable addition to these systems. They are based on mathematical algorithms processing data that describe the state of a device. To gain this data, a monitoring client is needed that extracts usable information (features) from the monitored system. Our approach follows a dual system for analyzing these features. On the one hand, functionality for light-weight on-device detection is provided. On the other hand, since most algorithms are resource-exhaustive, remote feature analysis is provided. This dual system enables event-based detection that can react to the current detection need. In our ongoing research we aim to investigate the feasibility of light-weight on-device detection for certain occasions. On other occasions, whenever significant changes are detected on the device, the system can trigger remote detection with heavy-weight algorithms for better detection results. In the absence of the server, or as a supplementary approach, we also consider a collaborative scenario in which mobile devices sharing a common objective are enabled by a collaboration module to share information, such as intrusion detection data and results. This is based on an ad-hoc network mode that can be provided by the WiFi or Bluetooth adapter nearly every smartphone possesses.
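A rough sketch of the dual light-weight/remote detection loop described above; the feature choice, the z-score heuristic, the threshold and the remote endpoint are all illustrative assumptions, not details from the paper.

```python
import json
import statistics
import urllib.request

REMOTE_URL = "http://detector.example.org/analyze"  # hypothetical analysis server

class DualDetector:
    """On-device light-weight scoring; escalates to remote analysis on big changes."""

    def __init__(self, threshold=3.0, window=50):
        self.history = []            # recent feature vectors (e.g. CPU load, net bytes, process count)
        self.threshold = threshold   # z-score above which remote analysis is triggered
        self.window = window

    def on_device_score(self, features):
        """Light-weight check: largest per-feature z-score against recent history."""
        if len(self.history) < 5:
            return 0.0
        scores = []
        for i, value in enumerate(features):
            past = [h[i] for h in self.history[-self.window:]]
            stdev = statistics.pstdev(past) or 1.0
            scores.append(abs(value - statistics.mean(past)) / stdev)
        return max(scores)

    def observe(self, features):
        score = self.on_device_score(features)
        self.history.append(features)
        if score > self.threshold:   # significant change: hand off to heavy-weight remote analysis
            payload = json.dumps({"features": features, "score": score}).encode()
            req = urllib.request.Request(REMOTE_URL, data=payload,
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)
        return score
```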
Abstract:
Traffic congestion is one of the biggest challenges for cities worldwide. Car traffic and traffic jams are causing major problems, and congestion is predicted to worsen in the future. The greenhouse effect has become a severe global threat to the environment, and from the point of view of companies and other economic actors, time and money are lost because of traffic congestion. This work studies possible traffic payment systems for the Helsinki Metropolitan Area, introducing three optional models and concentrating on the perspective of economic actors. A central part of the work is a questionnaire survey conducted among companies located in the Helsinki area, which gathered more than 1,000 responses. The study examines respondents' attitudes toward the area's current traffic system, its development, and urban congestion pricing, and the answers are analyzed according to the size, industry and location of the companies. The economic aspect is studied through the economic theory of industrial location and by emphasizing the importance of smoothly running traffic for the economy. Chapter three presents detailed information about traffic congestion: how today's car-centered society has been formed, what congestion concretely means for economic life, and how traffic congestion can be limited. It also examines how urban traffic payment systems work, using examples from London and Stockholm, where successful traffic payment schemes exist. The literature review analyzes urban development, increasing car traffic and the Helsinki Metropolitan Area from a structural point of view. The fourth chapter introduces a case study that concentrates on the Helsinki Metropolitan Area's different structures, the congestion situation in Helsinki, and the traffic payment system study. The region is currently experiencing a phase of big changes in traffic planning: the traffic systems are being unified to cover the whole region in the future, and solutions to the growing traffic congestion problems are needed. Chapter five concentrates on the questionnaire and theme interviews and introduces the research findings. The respondents' overall opinion of the traffic payments is quite skeptical. Some regional differences were found, and taxi, bus, cargo and transit enterprises in particular shared the most negative opinion. Economic actors were especially worried because traffic congestion harms business travel and employees' commutes. According to the respondents, the best of the traffic payment models was the ring model, in which payment points would be situated inside Ring Road III. Both company representatives and other key decision makers see public transportation as a good and powerful tool to decrease traffic congestion. The remaining question is where to find investors willing to invest in public transportation if economic representatives do not believe in pricing traffic through, for example, traffic payment systems.
Abstract:
Consumer demand is revolutionizing the way products are being produced, distributed and marketed. In the dairy sector in developing countries, aspects of milk quality are receiving more attention from both society and government. However, milk quality management needs to be better addressed in dairy production systems to guarantee the access of stakeholders, mainly smallholders, to dairy markets. The present study focuses on an analysis of the interaction of the upstream part of the dairy supply chain (farmers and dairies) in the Mantaro Valley (Peruvian central Andes), in order to understand the constraints both stakeholders face in implementing milk quality controls and practices, and to evaluate "ex-ante" how different strategies suggested to improve milk quality could affect farmers' and processors' profits. The analysis is based on three complementary field studies conducted between 2012 and 2013. Our work has shown that the presence of a dual supply chain combining both formal and informal markets has a direct impact on dairy production at the technical and organizational levels, affecting small formal dairy processors' ability to implement contracts, including agreements on milk quality standards. The analysis of milk quality management from farms to dairy plants highlighted the poor hygiene in the study area, even though average values of milk composition were usually high. Some husbandry practices evaluated at farm level proved cost-effective and had a large impact on hygienic quality; however, regular application of these practices was limited, since small-scale farmers do not receive a bonus for producing hygienic milk. On the basis of these two results, we co-designed with formal small-scale dairy processors a simulation tool to explore prospective scenarios, in which they could select their best product portfolio and also design milk payment systems to reward farmers with high milk-quality performance. This type of approach allowed dairy processors to realize the importance of including milk quality management in their collection and manufacturing processes, especially in a context of high competition for milk supply. We conclude that the improvement of milk quality in a smallholder farming context requires a more coordinated effort among stakeholders. Successful implementation of strategies will depend on the willingness of small-scale dairy processors to reward farmers producing high-quality milk, and also on support from the State to provide incentives to stakeholders in the formal sector.
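An illustrative sketch of a quality-based milk payment rule of the kind such a simulation tool might explore; the base price, thresholds and bonus factors below are hypothetical values, not figures from the study.

```python
def milk_price_per_litre(base_price, fat_pct, protein_pct, bacteria_cfu_ml):
    """Toy quality-based payment rule: compositional premiums plus a hygiene bonus/penalty."""
    price = base_price
    # compositional premium: pay more above reference fat/protein contents (illustrative rates)
    price += 0.02 * max(fat_pct - 3.5, 0.0)
    price += 0.03 * max(protein_pct - 3.0, 0.0)
    # hygiene bonus/penalty based on total bacterial count (illustrative thresholds)
    if bacteria_cfu_ml <= 100_000:
        price *= 1.05   # reward hygienic milk
    elif bacteria_cfu_ml > 1_000_000:
        price *= 0.90   # penalise poor hygiene
    return round(price, 3)

# Example: a clean, protein-rich delivery earns a premium over the base price.
print(milk_price_per_litre(0.35, 4.0, 3.4, 80_000))
```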
Abstract:
The crisis that broke out in the United States mortgage market in 2008 and spread throughout the entire financial system exposed the level of interconnection that currently exists among financial institutions and their relations with the productive sector, highlighting the need to identify and characterize the systemic risk inherent in the system, so that regulators can pursue stability both of individual institutions and of the system as a whole. This document shows, through a model that combines the informative power of networks with a spatial autoregressive panel model, the importance of adding to the micro-prudential approach (proposed in Basel II) a variable that captures the effect of being connected to other institutions, thereby carrying out a macro-prudential analysis (proposed in Basel III).
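A generic spatial-autoregressive panel specification of the kind described, in which a network term captures the effect of being connected to other institutions; the notation is illustrative and not taken from the paper.

```latex
% y_{it}: risk indicator of institution i at time t; W = (w_{ij}): interbank network weights;
% \rho: network (spatial) dependence; \mu_i: institution fixed effect.
y_{it} = \rho \sum_{j=1}^{N} w_{ij}\, y_{jt} + \mathbf{x}_{it}'\boldsymbol{\beta} + \mu_i + \varepsilon_{it}
```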
Abstract:
Introduction: Diagnosis-Related Groups (DRGs) have been used to determine quality of care in several health systems, leading to results in the continuous improvement of care. The objective of this study is to determine the clinical outcomes of patients who underwent joint replacement, according to clinical complexity defined by DRGs. Methods: A descriptive longitudinal study was carried out including all patients who underwent total shoulder, hip or knee replacement surgery between 2012 and 2014. Patients were stratified into three complexity levels given by the DRG system, and the proportions of patients were determined for the variables hospital stay, thromboembolic disease, cardiovascular disease, and surgical-site infection. Results: A total of 886 joint replacements were performed, of which 40 (4.5%) presented complications. The most frequent events were coronary complications, at 2.4%. DRG 1, without complications or comorbidities, was the group with the highest number of events. Hospital stay ranged from 3.8 to 9.3 days across all replacements. Conclusions: Contrary to the study hypothesis, the first DRG presented the highest number of complications, which may be related to the size of the group. Further research is needed to support the use of DRGs as a tool for evaluating clinical outcomes.
Abstract:
This dissertation studies the effects of Information and Communication Technologies (ICT) on the banking sector and the payments system. It provides insight into how technology-induced changes occur, by exploring both the nature and scope of the main technology innovations and evidencing their economic implications for banks and payment systems. Some parts of the dissertation are descriptive. They summarise the main technological developments in the field of finance and link them to economic policies. These parts are complemented with sections of the study that focus on assessing the extent of technology application to banking and payment activities. Finally, it also includes some work that borrows from the economic literature on banking. The need for an interdisciplinary approach arises from the complexity of the topic and the rapid pace of change to which it is subject. The first chapter provides an overview of the influence of developments in ICT on the evolution of financial services and international capital flows. We include main indicators and discuss innovation in the financial sector, exchange rates and international capital flows. The chapter concludes with impact analysis and policy options regarding the international financial architecture, some monetary policy issues and the role of international institutions. The second chapter is a technology assessment study that focuses on the relationship between technology and money. The application of technology to payment systems is transforming the way we use money and, in some instances, is blurring the definition of what constitutes money. This chapter surveys the developments in electronic forms of payment and their relationship to the banking system. It also analyses the challenges posed by electronic money for regulators and policy makers, and in particular the opportunities created by two simultaneous processes: the Economic and Monetary Union and the increasing use of electronic payment instruments. The third chapter deals with the implications of developments in ICT for relationship banking. The financial intermediation literature explains relationship banking as a type of financial intermediation characterised by proprietary information and multiple interactions with customers. This form of banking is important for the financing of small and medium-sized enterprises. We discuss the effects of ICT on the banking sector as a whole and then apply these developments to the case of relationship banking. The fourth chapter is an empirical study of the effects of technology on the banking business, using a sample of data from the Spanish banking industry. The design of the study is based on some of the events described in the previous chapters, and also draws from the economic literature on banking. The study shows that developments in information management have differential effects on wholesale and retail banking activities. Finally, the last chapter is a technology assessment study on electronic payment systems in Spain and the European Union. It contains an analysis of existing payment systems and ongoing or planned initiatives in Spain. It forms part of a broader project comprising a series of country-specific analyses covering ten European countries. The main issues raised across the countries serve as the starting point to discuss the implications of the development of electronic money for regulation and policies, and in particular, for monetary policy making.
Abstract:
This study investigated the potential application of mid-infrared spectroscopy (MIR 4,000–900 cm−1) for the determination of milk coagulation properties (MCP), titratable acidity (TA), and pH in Brown Swiss milk samples (n = 1,064). Because MCP directly influence the efficiency of the cheese-making process, there is strong industrial interest in developing a rapid method for their assessment. Currently, the determination of MCP involves time-consuming laboratory-based measurements, and it is not feasible to carry out these measurements on the large numbers of milk samples associated with milk recording programs. Mid-infrared spectroscopy is an objective and nondestructive technique providing rapid real-time analysis of food compositional and quality parameters. Analysis of milk rennet coagulation time (RCT, min), curd firmness (a30, mm), TA (SH°/50 mL; SH° = Soxhlet-Henkel degree), and pH was carried out, and MIR data were recorded over the spectral range of 4,000 to 900 cm−1. Models were developed by partial least squares regression using untreated and pretreated spectra. The MCP, TA, and pH prediction models were improved by using the combined spectral ranges of 1,600 to 900 cm−1, 3,040 to 1,700 cm−1, and 4,000 to 3,470 cm−1. The root mean square errors of cross-validation for the developed models were 2.36 min (RCT, range 24.9 min), 6.86 mm (a30, range 58 mm), 0.25 SH°/50 mL (TA, range 3.58 SH°/50 mL), and 0.07 (pH, range 1.15). The most successfully predicted attributes were TA, RCT, and pH. The model for the prediction of TA provided approximate prediction (R2 = 0.66), whereas the predictive models developed for RCT and pH could discriminate between high and low values (R2 = 0.59 to 0.62). It was concluded that, although the models require further development to improve their accuracy before their application in industry, MIR spectroscopy has potential application for the assessment of RCT, TA, and pH during routine milk analysis in the dairy industry. The implementation of such models could be a means of improving MCP through phenotype-based selection programs and of amending milk payment systems to incorporate MCP into their payment criteria.
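A minimal sketch, not the study's actual chemometric pipeline, of fitting a partial least squares model on the combined spectral ranges quoted above and reporting a cross-validated RMSE; the helper names, fold count and number of latent variables are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Combined informative regions reported in the abstract: 1,600-900, 3,040-1,700 and 4,000-3,470 cm^-1.
RANGES = ((900, 1600), (1700, 3040), (3470, 4000))

def select_ranges(wavenumbers, spectra, ranges=RANGES):
    """Keep only the spectral variables falling inside the combined wavenumber regions."""
    mask = np.zeros_like(wavenumbers, dtype=bool)
    for lo, hi in ranges:
        mask |= (wavenumbers >= lo) & (wavenumbers <= hi)
    return spectra[:, mask]

def fit_rct_model(wavenumbers, spectra, rct, n_components=10):
    """Fit a PLS model for rennet coagulation time and report a 10-fold cross-validated RMSE."""
    X = select_ranges(wavenumbers, spectra)
    pls = PLSRegression(n_components=n_components)
    pred = cross_val_predict(pls, X, rct, cv=10)
    rmsecv = float(np.sqrt(np.mean((rct - pred.ravel()) ** 2)))
    return pls.fit(X, rct), rmsecv
```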
Abstract:
Historically, payment systems and capital intermediation interact. Friedman (1959), and many observers of bank instabilities, have advocated separating depositary from credit institutions. His proposal today meets an ever-increasing provision of inside money and a shortage of monetary models of bank intermediation. In this paper, we evaluate the proposal from a new angle, with a model in which isolating a safe payments system from commercial intermediation undermines information complementarities in banking activities. Some features of the environment resemble the models in Diamond and Dybvig (1983) and Kiyotaki and Wright (1989).
Abstract:
The peer-to-peer network paradigm is drawing the attention of both end users and researchers for its features. P2P networks shift from the classic client-server approach to a high level of decentralization where there is no central control and all nodes should be able not only to request services, but to provide them to other peers as well. While on one hand such a high level of decentralization may lead to interesting properties like scalability and fault tolerance, on the other hand it implies many new problems to deal with. A key feature of many P2P systems is openness, meaning that everybody is potentially able to join a network with no need for subscription or payment systems. The combination of openness and lack of central control makes it feasible for a user to free-ride, that is, to increase its own benefit by using services without allocating resources to satisfy other peers' requests. One of the main goals when designing a P2P system is therefore to achieve cooperation between users. Given the nature of P2P systems, based on simple local interactions of many peers having partial knowledge of the whole system, an interesting way to achieve desired properties at system scale is to obtain them as emergent properties of the many interactions occurring at the local node level. Two methods are typically used to address the problem of cooperation in P2P networks: 1) engineering emergent properties when designing the protocol; 2) studying the system as a game and applying game-theoretic techniques, especially to find Nash equilibria in the game and to reach them, making the system stable against possible deviant behaviors. In this work we present an evolutionary framework to enforce cooperative behaviour in P2P networks that is an alternative to both methods mentioned above. Our approach is based on an evolutionary algorithm inspired by computational sociology and evolutionary game theory, in which each peer periodically tries to copy another peer that is performing better. The proposed algorithms, called SLAC and SLACER, draw inspiration from tag systems originating in computational sociology; the main idea behind them is to have low-performance nodes copy high-performance ones. The algorithm is run locally by every node and leads to an evolution of the network both in its topology and in the nodes' strategies. Initial tests with a simple Prisoner's Dilemma application show how SLAC is able to bring the network to a state of high cooperation independently of the initial network conditions. Interesting results are obtained when studying the effect of cheating nodes on the SLAC algorithm: in some cases, selfish nodes rationally exploiting the system for their own benefit can actually improve system performance from the point of view of cooperation formation. The final step is to apply our results to more realistic scenarios. We focused our efforts on studying and improving the BitTorrent protocol. BitTorrent was chosen not only for its popularity but also because it has many points in common with the SLAC and SLACER algorithms, ranging from its game-theoretic inspiration (a tit-for-tat-like mechanism) to its swarm topology.
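A simplified sketch of the SLAC-style update rule just described, in which low-performance nodes copy the strategy and links of better-performing ones, with occasional mutation; this is an illustrative reconstruction, not the authors' implementation.

```python
import random

class Node:
    def __init__(self, node_id, strategy):
        self.id = node_id
        self.strategy = strategy        # e.g. "cooperate" or "defect"
        self.neighbours = set()         # ids of linked peers (the node's local view)
        self.utility = 0.0              # accumulated payoff from local interactions

def slac_step(nodes, mutation_rate=0.05):
    """One evolutionary round: each node compares itself with a random node and may copy it."""
    for node in nodes:
        other = random.choice(nodes)
        if other is node:
            continue
        if other.utility > node.utility:
            # copy the better-performing node: its strategy and its neighbourhood (plus a link to it)
            node.strategy = other.strategy
            node.neighbours = set(other.neighbours) | {other.id}
        if random.random() < mutation_rate:
            # occasional mutation keeps the population exploring strategies and rewiring links
            node.strategy = random.choice(["cooperate", "defect"])
            node.neighbours = {random.choice(nodes).id}
```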
We found fairness, defined as the ratio between uploaded and downloaded data, to be a weakness of the original BitTorrent protocol, and we drew on the knowledge of cooperation formation and maintenance mechanisms gained from developing and analyzing SLAC and SLACER to improve fairness and tackle free-riding and cheating in BitTorrent. We produced an extension of BitTorrent, called BitFair, which has been evaluated through simulation and has shown the ability to enforce fairness and tackle free-riding and cheating nodes.
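As a toy illustration of the fairness measure mentioned above, the ratio of uploaded to downloaded data and a simple reciprocation rule based on it; the threshold is illustrative, not one of BitFair's actual parameters.

```python
def fairness_ratio(uploaded_bytes, downloaded_bytes):
    """Fairness as the ratio of data a peer has uploaded to what it has downloaded."""
    return uploaded_bytes / downloaded_bytes if downloaded_bytes else float("inf")

def should_unchoke(peer_uploaded, peer_downloaded, min_ratio=0.5):
    """Keep serving peers that reciprocate at least `min_ratio` of what they take."""
    return fairness_ratio(peer_uploaded, peer_downloaded) >= min_ratio
```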