938 results for Security, usability, digital signature


Relevance: 30.00%

Abstract:

It is now generally accepted that cyber crime represents a big threat to organisations, and that they need to take appropriate action to protect their valuable information assets. However, current research shows that, although small businesses understand that they are potentially vulnerable, many are still not taking sufficient action to counteract the threat. Last year, the authors sought, through a more generalised but categorised attitudinal study, to explore the reasons why smaller SMEs in particular were reluctant to engage with accepted principles for protecting their data. The results showed that SMEs understood many of the issues. They were prepared to spend more, but were particularly suspicious about spending on information assurance. The authors' current research again focuses on SME attitudes, but this time the survey asks only questions directly relating to information assurance and the standards available, in an attempt to understand exactly what is causing SMEs to shy away from getting the badge or certificate that would demonstrate to customers and business partners that they take cyber security seriously. As with last year's study, the results and analysis provide useful pointers towards the broader business-environment changes that might make SMEs more interested in working towards an appropriate cyber security standard.

Relevance: 30.00%

Abstract:

End users urgently want to use mobile devices at their workplace. They know these devices from their private lives, appreciate their functionality and usability, and want to benefit from these advantages at work as well; they will not accept limitations and restrictions. Companies, by contrast, are obliged to employ substantial organizational and technical measures to ensure data security and compliance when allowing mobile devices to be used in the workplace. So far, only individual arrangements have been presented, each addressing a single issue in ensuring data security and compliance. However, to play it safe, companies need to follow a comprehensive set of measures addressing all relevant aspects of data security and compliance. Thus, this paper first reviews technical architectures for using mobile devices in enterprise IT. A set of compliance rules is then presented and, as the major contribution, technical measures are explained that enable a company to integrate mobile devices into enterprise IT while still complying with these rules comprehensively. Depending on the company context, one or more of the technical architectures have to be chosen, which affects the specific technical measures for compliance as elaborated in this paper. Altogether, this paper for the first time correlates technical architectures for using mobile devices at the workplace with technical measures to assure data security and compliance according to a comprehensive set of rules.

Relevance: 30.00%

Abstract:

The human factor is often recognised as a major aspect of cyber-security research. Risk and situational perception are identified as key factors in the decision-making process, often playing a lead role in the adoption of security mechanisms. However, risk awareness and perception have been poorly investigated in the field of eHealth wearables. Whilst end users often have a limited understanding of the privacy and security of wearables, assessing the perceived risks and consequences will help shape the usability of future security mechanisms. This paper presents a survey of the risks and situational awareness in eHealth services. An analysis of the lack of security and privacy measures in connected health devices is presented, with recommendations for avoiding critical situations.

Relevance: 30.00%

Abstract:

This presentation provided the results of a two-round iterative study of WorldCat UMD usability.

Relevance: 30.00%

Abstract:

Bikeshares promote healthy lifestyles and sustainability among commuters, casual riders, and tourists. However, the central pillar of modern systems, the bike station, cannot be easily integrated into a compact college campus. Fixed stations lack the flexibility to meet the needs of college students who make quick, short-distance trips, and the cost of implementing and maintaining each station prohibits adding stations for user convenience. Therefore, the team developed a stationless bikeshare based on a smartlock permanently attached to each bicycle in the system. The smartlock design incorporates several innovative approaches to provide the usability, security, and reliability that overcome the limitations of a station-centered design. A focus group discussion allowed the team to receive feedback on the early lock, system, and website designs, identify improvements, and craft a pleasant user experience. The team designed a unique two-step lock system that is intuitive to operate while mitigating user error. To ensure security, user access is limited through near field communication (NFC) technology connected to a mechatronic release system, which relies on an NFC module and a servo driven by an Arduino microcontroller programmed in the Arduino IDE. To track rentals and maintain the system, each bike is fitted with an XBee module that communicates over a scalable ZigBee mesh network. The network allows bidirectional, real-time communication with a Meteor.js web application, which provides user and administrator functions through an intuitive interface available on mobile and desktop. The development of an independent smartlock to replace bike stations is essential to meet the needs of the modern college student.
With the goal of creating a bikeshare that better serves college students, Team BIKES has laid the framework for a system that is affordable, easily adaptable, and implementable on any university expressing an interest in bringing a bikeshare to its campus.
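An illustrative sketch (in Python, purely for readability; the actual system described above was an Arduino sketch driving a servo) of the NFC-gated release logic: only registered tags can toggle the lock, so an unauthorized scan never changes the lock state. All names and tag UIDs here are hypothetical.

```python
# Hypothetical set of registered NFC tag UIDs (hex strings).
AUTHORIZED_TAGS = {"04A1B2C3", "04D5E6F7"}

def handle_tag(tag_uid, lock_engaged):
    """Return (release_servo, new_lock_state) for a scanned NFC tag.

    Mirrors the two-step, user-error-mitigating design: an
    unauthorized scan is denied and leaves the lock state unchanged;
    an authorized scan drives the mechatronic release.
    """
    if tag_uid not in AUTHORIZED_TAGS:
        return False, lock_engaged        # deny: state unchanged
    return True, not lock_engaged         # authorized: toggle the lock
```

In the real system the return value would drive the servo through the Arduino's PWM output; here it simply signals the decision.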

Relevance: 30.00%

Abstract:

We propose three research problems to explore the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted as a two-layer structure, with the top layer running a trust-based consensus algorithm and the bottom layer, as a subroutine, executing a global trust update scheme. We utilize a set of pre-trusted nodes, headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it can easily be extended to incorporate more complicated decision rules and global trust schemes. The first problem assumes that normal nodes are homogeneous, i.e., a normal node is guaranteed to behave as it is programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers. In the second part of the thesis, we consider a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge-learning task, however, scalar-valued trust is not sufficient to reflect a worker's trustworthiness in each of the domains.
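A minimal sketch of the trust-weighted consensus idea, under simplifying assumptions (a fully connected network and a single global trust score per node; the thesis's two-layer algorithm with headers and local trust evaluations is richer than this):

```python
# Each node updates toward a trust-weighted average, so nodes assigned
# low global trust (suspected adversaries) contribute little or nothing.

def consensus_step(values, trust):
    """One synchronous update over a fully connected network.

    values: current node values
    trust:  global trust scores in [0, 1], one per node
    """
    weighted_avg = sum(t * v for t, v in zip(trust, values)) / sum(trust)
    # Every normal node adopts the trust-weighted average.
    return [weighted_avg] * len(values)

# The third node behaves arbitrarily and has been assigned zero trust,
# so its outlier value is ignored by the update.
state = consensus_step([1.0, 1.2, 100.0], [1.0, 1.0, 0.0])
```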
To address this issue, we propose a probabilistic model to jointly infer the multi-dimensional trust of workers, the multi-domain properties of questions, and the true labels of questions. Our model is very flexible and can be extended to incorporate metadata associated with questions. To show this, we further propose two extended models, one of which handles input tasks with real-valued features while the other handles tasks with text features by incorporating topic models. Our models can effectively recover the trust vectors of workers, which can be very useful for future task assignment that adapts to workers' trust. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine learning results, or a hybrid of them. In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that questions are often not independent in real-life applications; instead, there are logical relations between them. Similarly, the workers who provide answers are not independent of each other: answers given by workers with similar attributes tend to be correlated. Therefore, we propose a novel unified graphical model consisting of two layers. The top layer encodes domain knowledge, allowing users to express logical relations using first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers. The third part of the thesis considers the problem of optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost.
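A hedged sketch of the idea behind multi-dimensional trust: aggregate answers with per-domain weights, so a worker counts heavily only in the domains where they are trustworthy. The thesis performs full probabilistic joint inference; this is merely a simplified per-domain weighted vote with hypothetical worker and domain names.

```python
def weighted_vote(answers, trust, domain):
    """Pick the label with the highest trust-weighted support.

    answers: {worker: label}
    trust:   {worker: {domain: weight in [0, 1]}}
    """
    scores = {}
    for worker, label in answers.items():
        w = trust[worker].get(domain, 0.0)
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# w1 is trusted in medicine, w2 in law; the same pair of answers
# therefore resolves differently depending on the question's domain.
trust = {"w1": {"medicine": 0.9, "law": 0.1},
         "w2": {"medicine": 0.2, "law": 0.8}}
answers = {"w1": "yes", "w2": "no"}
```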
The cost is associated with both the level of trustworthiness of the workers and the difficulty of the tasks. Typically, access to expert-level (more trustworthy) workers is more expensive than access to the average crowd, and completing a challenging task is more costly than answering a click-away question. Here we address the problem of optimally assigning heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as inputs the estimated trust of workers and a pre-set budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that relates naturally to the budget, the trustworthiness of the crowd, and the cost of obtaining labels: a higher budget, a more trustworthy crowd, and less costly jobs all result in a lower theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component, so it can be combined with generic trust evaluation algorithms.
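The cost/trust trade-off can be illustrated with a toy greedy heuristic (an assumption on my part; the thesis derives optimal assignments and error bounds, which this sketch does not attempt): assign the hardest tasks first to the cheapest worker whose trust clears the task's difficulty, stopping when the budget runs out.

```python
def allocate(tasks, workers, budget):
    """Greedy trust-aware allocation sketch.

    tasks:   [(name, difficulty in [0, 1])]
    workers: [(name, trust in [0, 1], cost per answer)]
    Returns (assignment dict, total spent).
    """
    assignment, spent = {}, 0.0
    for task, difficulty in sorted(tasks, key=lambda t: -t[1]):
        # Workers trustworthy enough for this task, cheapest first.
        ok = sorted((w for w in workers if w[1] >= difficulty),
                    key=lambda w: w[2])
        if ok and spent + ok[0][2] <= budget:
            assignment[task] = ok[0][0]
            spent += ok[0][2]
    return assignment, spent
```

Expensive experts are used only where the task's difficulty demands them; easy tasks go to the cheap crowd, stretching the budget.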

Relevance: 30.00%

Abstract:

This dissertation explores why some states consistently secure food imports at prices higher than the world market price, thereby exacerbating food insecurity domestically. I challenge the idea that free-market economics alone can explain these trade behaviors, arguing instead that states take political considerations into account when engaging in food trade, which results in inefficient trade. In particular, states that are dependent on imports of staple food products, like cereals, are wary of the potential strategic value of these goods to exporters. I argue that this consideration, combined with the importing state's ability to mitigate that risk through its own forms of political or economic leverage, shapes the behavior of the importing state and contributes to its potential for food security. In addition to cross-national analyses, I use case studies of the Gulf Cooperation Council states and Jordan to demonstrate how the political tools available to these importers affect their food security. The results of my analyses suggest that when import-dependent states have access to forms of political leverage, they are more likely to trade efficiently, thereby increasing their potential for food security.

Relevance: 30.00%

Abstract:

The problem: Around 300 million people worldwide have asthma, and prevalence is increasing. Support for optimal self-management can be effective in improving a range of outcomes and is cost effective, but it is underutilised as a treatment strategy. Supporting optimum self-management using digital technology shows promise, but how best to do this is not clear. Aim: The purpose of this project was to explore the potential role of a digital intervention in promoting optimum self-management in adults with asthma. Methods: Following the MRC Guidance on the Development and Evaluation of Complex Interventions, which advocates using theory, evidence, user testing and appropriate modelling and piloting, this project had three phases. Phase 1: Examination of the literature to inform phases 2 and 3, using systematic review methods and focussed literature searching. Phase 2: Developing the Living Well with Asthma website. A prototype (paper-based) version of the website was developed iteratively with input from a multidisciplinary expert panel, empirical evidence from the literature (from phase 1), and potential end users via focus groups (adults with asthma and practice nurses). Implementation and behaviour change theories informed this process. The paper-based designs were converted to the website through an iterative, user-centred process (think-aloud studies with adults with asthma). Participants considered contents, layout, and navigation. Development was agile, using feedback from the think-aloud sessions immediately to inform design and subsequent sessions. Phase 3: A pilot randomised controlled trial over 12 weeks to evaluate the feasibility of a Phase III trial of Living Well with Asthma to support self-management. Primary outcomes were 1) recruitment and retention; 2) website use; 3) Asthma Control Questionnaire (ACQ) score change from baseline; 4) Mini Asthma Quality of Life Questionnaire (AQLQ) score change from baseline.
Secondary outcomes were patient activation, adherence, lung function, fractional exhaled nitric oxide (FeNO), a generic quality of life measure (EQ-5D), medication use, prescribing, and health services contacts. Results: Phase 1 demonstrated that while digital interventions show promise, with some evidence of effectiveness for certain outcomes, participants were poorly characterised, telling us little about the reach of these interventions. The interventions themselves were also poorly described, making it impossible to draw definitive conclusions about what worked and what did not. Phase 2: The literature indicated that important aspects to cover in any self-management intervention (digital or not) included asthma action plans, regular health professional review, trigger avoidance, psychological functioning, self-monitoring, inhaler technique, and goal setting. The website asked users to aim to be symptom free. Key behaviours targeted to achieve this included optimising medication use (including inhaler technique); attending primary care asthma reviews; using asthma action plans; increasing physical activity levels; and stopping smoking. The website had 11 sections, plus email reminders, which promoted these behaviours. Feedback during the think-aloud studies was mainly positive, with most changes focussing on clarification of language, the order of pages, and usability issues mainly relating to navigation difficulties. Phase 3: To achieve our recruitment target, 5383 potential participants were invited, leading to 51 participants randomised (25 to the intervention group). Age range 16-78 years; 75% female; 28% from the most deprived quintile. Nineteen (76%) of the intervention group used the website, for an average of 23 minutes. Non-significant improvements in favour of the intervention group were observed in the ACQ score (-0.36; 95% confidence interval: -0.96, 0.23; p=0.225) and mini-AQLQ scores (0.38; -0.13, 0.89; p=0.136).
A significant improvement was observed in the activity limitation domain of the mini-AQLQ (0.60; 0.05 to 1.15; p = 0.034). Secondary outcomes showed increased patient activation and reduced reliance on reliever medication. There was no significant difference in the remaining secondary outcomes. There were no adverse events. Conclusion: Living Well with Asthma has been shown to be acceptable to potential end users, and has potential for effectiveness. This intervention merits further development, and subsequent evaluation in a Phase III full scale RCT.

Relevance: 30.00%

Abstract:

This document describes work carried out with the company MedSUPPORT[1] to develop a digital platform for analysing patient satisfaction in healthcare units. Assessing customer satisfaction is today an important procedure that companies should use as one more tool for evaluating their products and services. For healthcare units, assessing patient satisfaction is currently considered a fundamental objective of health services and has come to occupy a progressively more important place in evaluating their quality. In this context, the project set out to develop a digital platform for analysing the satisfaction of healthcare-unit patients. An initial study of the concept of consumer and patient satisfaction consolidated the concepts associated with the topic; one of its key points was identifying the eight dimensions that, according to researchers, make up patient satisfaction. Since satisfaction must be assessed by asking patients directly, a satisfaction questionnaire was developed, with each of its elements studied carefully. Development of the questionnaire followed these steps: planning the questionnaire, working from the eight dimensions of patient satisfaction down to the metrics to be assessed with patients; analysing the data to be collected, defining for each metric whether the data would be nominal, ordinal, or drawn from balanced scales; and, finally, formulating the questions carefully so that patients perceive the purpose of each question as clearly as possible.
The specification of the platform and of the questionnaire drew on several studies, including a benchmarking analysis[2], which established that the questionnaire would be placed in an accessible area of the healthcare unit, answered on a touchscreen (tablet), and hosted on the web. Web applications today feature appealing, intuitive designs, so a study of the web application's design was essential to ensure that the colours used, the typography, and the placement of information were the most appropriate. The web application was developed in the Ruby programming language using the Ruby on Rails framework; the available technologies were studied for the implementation, with a focus on the choice of database management system. The web application was also intended to improve the management of the information generated by responses to the satisfaction questionnaire. The MedSUPPORT staff member is responsible for managing this information, and their needs were addressed: an information-management menu is available to the application administrator (the MedSUPPORT staff member), allowing a simplified analysis of the current state via a dashboard-style panel and, to improve internal data analysis, offering export of the data to a spreadsheet. To validate the study, the platform was tested both for its functionality and in real-world use by patients surveyed in healthcare units; the real-world tests aimed to validate the concept with the surveyed patients.
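An illustrative sketch (in Python, purely for readability; the actual MedSUPPORT application was built with Ruby on Rails) of the kind of per-dimension aggregation a satisfaction dashboard performs. The dimension names and 1-5 ordinal scores here are hypothetical.

```python
from collections import defaultdict

def summarize(responses):
    """Average ordinal satisfaction scores per dimension.

    responses: list of (dimension, score) pairs, scores on a 1-5 scale.
    Returns {dimension: mean score}, the kind of figure a
    dashboard-style panel would display.
    """
    totals = defaultdict(list)
    for dimension, score in responses:
        totals[dimension].append(score)
    return {d: sum(s) / len(s) for d, s in totals.items()}

data = [("waiting time", 3), ("waiting time", 4), ("staff courtesy", 5)]
```

Exporting `summarize(data)` row by row would correspond to the spreadsheet-export function mentioned above.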

Relevance: 30.00%

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts; the need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome and of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and their mitigation, and can be deconstructed into atomic units with responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment; taking risk as the focus of our analysis safeguards against omissions that may occur when assessment is pursued along functional, departmental or role-based lines. We discuss the importance of relating risk factors, through the risks themselves or through associated system elements; doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
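A minimal sketch of the risk characterisation described above: a risk couples the likelihood of a negative outcome with its impact, and carries the mitigations and the delegated responsibility needed to resolve it. The field names here are illustrative, not the PORRO ontology's actual terms.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    name: str
    likelihood: float      # probability of the negative outcome
    impact: float          # severity if it occurs
    owner: str = "unassigned"
    mitigations: list = field(default_factory=list)

    def exposure(self):
        # Conventional likelihood x impact scoring, used to rank risks.
        return self.likelihood * self.impact

r = Risk("format obsolescence", likelihood=0.4, impact=0.9,
         owner="repository manager",
         mitigations=["migrate to an open format",
                      "preserve rendering software"])
```

A collection of such records, linked through shared system elements, is the flat-file analogue of the linked-data structure the thesis builds.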

Relevance: 30.00%

Abstract:

This thesis, "Estudio de la versión digital de Diario El Tiempo: análisis y propuesta de innovación", aims to analyse the website of Diario El Tiempo of Cuenca to determine whether it meets users' expectations and needs, to review whether it takes advantage of the multimedia possibilities of the digital world, and finally to present a proposal for improving the digital edition. Chapter I reviews general concepts of communication and digital journalism and their leading exponents worldwide. Chapter II briefly describes the history of journalism in Ecuador, the websites of the country's main print media, and new multimedia narratives. Chapter III analyses the characteristics of eltiempo.com.ec, including design, content, usability, accessibility, information architecture, audiovisual and multimedia resources, communicative interaction with users, and social networks, and finally compares it with other digital outlets such as El Comercio and El Mercurio. Chapter IV presents the results of a survey conducted to gauge the site's acceptance among internet users. Finally, Chapter V presents an innovation proposal for the site and for the staff who produce its content and maintain it, together with a manual for web writing and social media use.

Relevance: 30.00%

Abstract:

Secure computation involves multiple parties computing a common function while keeping their inputs private, and is a growing field of cryptography due to its potential for maintaining privacy guarantees in real-world applications. However, current secure computation protocols are not yet efficient enough to be used in practice. We argue that this is due to much of the research effort being focused on generality rather than specificity. Namely, current research tends to focus on constructing and improving protocols for the strongest notions of security or for an arbitrary number of parties. However, in real-world deployments, these security notions are often too strong, or the number of parties running a protocol would be smaller. In this thesis we make several steps towards bridging the efficiency gap of secure computation by focusing on constructing efficient protocols for specific real-world settings and security models. In particular, we make the following four contributions: - We show an efficient (when amortized over multiple runs) maliciously secure two-party secure computation (2PC) protocol in the multiple-execution setting, where the same function is computed multiple times by the same pair of parties. - We improve the efficiency of 2PC protocols in the publicly verifiable covert security model, where a party can cheat with some probability but if it gets caught then the honest party obtains a certificate proving that the given party cheated. - We show how to optimize existing 2PC protocols when the function to be computed includes predicate checks on its inputs. - We demonstrate an efficient maliciously secure protocol in the three-party setting.
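A toy sketch of the core idea underlying secure multiparty computation: additive secret sharing over a modulus. Each party holds only a random-looking share of each input, yet shares can be combined locally so that reconstruction yields the sum of the secrets. Real 2PC protocols of the kind this thesis improves (garbled circuits, maliciously secure variants) are far more involved; this only conveys why privacy and computation can coexist.

```python
import random

MOD = 2**32

def share(x):
    """Split secret x into two additive shares, each uniformly random."""
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def reconstruct(a, b):
    return (a + b) % MOD

# Two parties add their secrets without revealing them: each party
# adds its shares locally, and only the sum is ever reconstructed.
x0, x1 = share(25)                          # party A's secret
y0, y1 = share(17)                          # party B's secret
s0, s1 = (x0 + y0) % MOD, (x1 + y1) % MOD   # local share-wise addition
print(reconstruct(s0, s1))                  # 42
```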

Relevance: 30.00%

Abstract:

The past several years have seen the surprising and rapid rise of Bitcoin and other "cryptocurrencies." These are decentralized peer-to-peer networks that allow users to transmit money, to compose financial instruments, and to enforce contracts between mutually distrusting peers, and that show great promise as a foundation for financial infrastructure that is more robust, efficient and equitable than ours today. However, it is difficult to reason about the security of cryptocurrencies. Bitcoin is a complex system, comprising many intricate and subtly-interacting protocol layers. At each layer it features design innovations that (prior to our work) have not undergone any rigorous analysis. Compounding the challenge, Bitcoin is but one of hundreds of competing cryptocurrencies in an ecosystem that is constantly evolving. The goal of this thesis is to formally reason about the security of cryptocurrencies, reining in their complexity, and providing well-defined and justified statements of their guarantees. We provide a formal specification and construction for each layer of an abstract cryptocurrency protocol, and prove that our constructions satisfy their specifications. The contributions of this thesis are centered around two new abstractions: "scratch-off puzzles," and the "blockchain functionality" model. Scratch-off puzzles are a generalization of the Bitcoin "mining" algorithm, its most iconic and novel design feature. We show how to provide secure upgrades to a cryptocurrency by instantiating the protocol with alternative puzzle schemes. We construct secure puzzles that address important and well-known challenges facing Bitcoin today, including wasted energy and dangerous coalitions. The blockchain functionality is a general-purpose model of a cryptocurrency rooted in the "Universal Composability" cryptography theory.
We use this model to express a wide range of applications, including transparent “smart contracts” (like those featured in Bitcoin and Ethereum), and also privacy-preserving applications like sealed-bid auctions. We also construct a new protocol compiler, called Hawk, which translates user-provided specifications into privacy-preserving protocols based on zero-knowledge proofs.
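A hedged sketch of a hash-based puzzle in the spirit of Bitcoin mining, the construction that the "scratch-off puzzle" abstraction generalizes: find a nonce such that the hash of (payload, nonce) falls below a target. This conveys only the solve/verify interface, not the thesis's formal definition or its alternative puzzle schemes.

```python
import hashlib

def solve(payload: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce whose SHA-256 output is below the target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        h = hashlib.sha256(payload + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce
        nonce += 1

def verify(payload: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Checking a solution takes one hash, while finding one takes
    ~2**difficulty_bits tries: the asymmetry mining relies on."""
    h = hashlib.sha256(payload + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") < 2 ** (256 - difficulty_bits)

n = solve(b"block header", 12)   # small difficulty so this runs quickly
print(verify(b"block header", n, 12))  # True
```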

Relevance: 30.00%

Abstract:

In the midst of the fourth industrial revolution, every industry is transforming itself to adjust to new paradigms of customer relations, heavily influenced by digital pioneers such as Uber, Netflix, and Amazon. The financial sector, however, faces additional challenges: customers expect to combine these digital expectations with continued human interaction, while banks simultaneously need to recover from the sovereign debt crisis, which imposed balance-sheet adjustments. The current wave of technological development, driven by the strong growth of mobile internet access, brings new habits and expectations in customers' relationships with institutions, with ever more powerful devices at ever lower cost. This created the perfect opportunity for technology startups willing to transform classic intermediation business models, giving rise in the financial sector to fintechs (technology-based companies dedicated to providing financial services) and imposing disruption on the financial industry, most notably in markets such as the US and the UK. Looking at the past five years of the financial sector, it is very difficult to anticipate what it will look like five years from now, but it will certainly be very different from what we know today. For that reason this work rests essentially on references from the past five years, drawing on company research studies and academic documents to characterise the sector in this context of permanent innovation and to assess the extent to which this "digitalisation" of the financial sector influences customers' propensity to contract more products and services, a central factor for banks in Portugal to recover economically.
It also analyses the approach taken by the financial sector's regulatory and supervisory institutions to foster competition and innovation while preserving security, trust, and control of systemic risk. The literature available for characterising Portuguese banking from an innovation and transformation perspective is quite scarce, but this work seeks to characterise the Portuguese financial system in terms of how it is responding to the challenges of technological and digital transformation. A research methodology was established to characterise customers' perception of the added value of digital services and the extent to which these can replace branches and the human intervention of bank staff; the conclusion was that these two elements remain central factors for customers.