797 results for Data-communication


Relevance:

30.00%

Publisher:

Abstract:

Big data is nowadays a fashionable topic, regardless of what people mean when they use the term. But being big is just a matter of volume, although there is no clear agreement on the size threshold. On the other hand, it is easy to capture large amounts of data using a brute-force approach. So the real goal should not be big data but to ask ourselves, for a given problem, what is the right data and how much of it is needed. For some problems this would imply big data, but for the majority of problems much less data is needed. In this talk we explore the trade-offs involved and the main problems that come with big data, using the Web as a case study: scalability, redundancy, bias, noise, spam, and privacy.

Speaker biography: Ricardo Baeza-Yates has been VP of Research for Yahoo Labs since 2006, leading teams in the United States, Europe and Latin America, and has been based in Sunnyvale, California, since August 2014. During this time he has led the labs in Barcelona and Santiago de Chile, and between 2008 and 2012 he also oversaw the Haifa lab. He is also a part-time Professor at the Dept. of Information and Communication Technologies of the Universitat Pompeu Fabra in Barcelona, Spain. During 2005 he was an ICREA research professor at the same university. Until 2004 he was a Professor and, before that, founder and Director of the Center for Web Research at the Dept. of Computing Science of the University of Chile (on leave of absence to this day). He obtained a Ph.D. in CS from the University of Waterloo, Canada, in 1989. Before that he obtained two master's degrees (M.Sc. CS & M.Eng. EE) and the electronics engineer degree from the University of Chile in Santiago. He is co-author of the best-selling textbook Modern Information Retrieval, published in 1999 by Addison-Wesley with a second, enlarged edition in 2011, which won the ASIST 2012 Book of the Year award. He is also co-author of the 2nd edition of the Handbook of Algorithms and Data Structures (Addison-Wesley, 1991) and co-editor of Information Retrieval: Algorithms and Data Structures (Prentice-Hall, 1992), among more than 500 other publications. From 2002 to 2004 he was elected to the board of governors of the IEEE Computer Society, and in 2012 he was elected to the ACM Council. He has received the Organization of American States award for young researchers in exact sciences (1993), the Graham Medal for innovation in computing given by the University of Waterloo to distinguished alumni (2007), the CLEI Latin American distinction for contributions to CS in the region (2009), and the National Award of the Chilean Association of Engineers (2010), among other distinctions. In 2003 he was the first computer scientist to be elected to the Chilean Academy of Sciences, and since 2010 he has been a founding member of the Chilean Academy of Engineering. In 2009 he was named an ACM Fellow and in 2011 an IEEE Fellow.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation studies the effects of Information and Communication Technologies (ICT) on the banking sector and the payments system. It provides insight into how technology-induced changes occur, by exploring both the nature and scope of the main technology innovations and evidencing their economic implications for banks and payment systems. Some parts of the dissertation are descriptive: they summarise the main technological developments in the field of finance and link them to economic policies. These parts are complemented with sections that focus on assessing the extent of technology application to banking and payment activities. Finally, it also includes some work that borrows from the economic literature on banking. The need for an interdisciplinary approach arises from the complexity of the topic and the rapid pace of change to which it is subject. The first chapter provides an overview of the influence of developments in ICT on the evolution of financial services and international capital flows. We include the main indicators and discuss innovation in the financial sector, exchange rates and international capital flows. The chapter concludes with an impact analysis and policy options regarding the international financial architecture, some monetary policy issues and the role of international institutions. The second chapter is a technology assessment study that focuses on the relationship between technology and money. The application of technology to payment systems is transforming the way we use money and, in some instances, is blurring the definition of what constitutes money. This chapter surveys developments in electronic forms of payment and their relationship to the banking system. It also analyses the challenges posed by electronic money for regulators and policy makers, and in particular the opportunities created by two simultaneous processes: the Economic and Monetary Union and the increasing use of electronic payment instruments. The third chapter deals with the implications of developments in ICT for relationship banking. The financial intermediation literature explains relationship banking as a type of financial intermediation characterised by proprietary information and multiple interactions with customers. This form of banking is important for the financing of small and medium-sized enterprises. We discuss the effects of ICT on the banking sector as a whole and then apply these developments to the case of relationship banking. The fourth chapter is an empirical study of the effects of technology on the banking business, using a sample of data from the Spanish banking industry. The design of the study is based on some of the events described in the previous chapters, and also draws on the economic literature on banking. The study shows that developments in information management have differential effects on wholesale and retail banking activities. Finally, the last chapter is a technology assessment study of electronic payment systems in Spain and the European Union. It contains an analysis of existing payment systems and of ongoing or planned initiatives in Spain. It forms part of a broader project comprising a series of country-specific analyses covering ten European countries. The main issues raised across the countries serve as the starting point for discussing the implications of the development of electronic money for regulation and policies, and in particular for monetary policy-making.

Relevance:

30.00%

Publisher:

Abstract:

This study examined the acoustical conditions, including surface-dimension measurements, background noise levels, and reverberation times, in classrooms in a metropolitan area. The data collected in this study will help school administrators realize that appropriate classroom acoustics are necessary for both hearing-impaired and normal-hearing students.
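
As a rough illustration of how reverberation time relates to the surface dimensions mentioned in the abstract, the short sketch below applies the standard Sabine formula (RT60 = 0.161 * V / A). The room size and absorption coefficients are invented for the example and are not data from the study.

# Illustrative only: estimate a classroom's reverberation time (RT60) from
# surface dimensions using the Sabine formula RT60 = 0.161 * V / A,
# where V is the room volume (m^3) and A is the total absorption (m^2 sabins).
# The room size and absorption coefficients below are hypothetical.

def rt60_sabine(length_m, width_m, height_m, absorption):
    """absorption maps surface name -> (area_m2, absorption coefficient)."""
    volume = length_m * width_m * height_m
    total_absorption = sum(area * alpha for area, alpha in absorption.values())
    return 0.161 * volume / total_absorption

room = dict(length_m=9.0, width_m=7.0, height_m=3.0)
surfaces = {
    "floor (carpet)":  (9.0 * 7.0, 0.30),
    "ceiling (tile)":  (9.0 * 7.0, 0.55),
    "walls (painted)": (2 * (9.0 + 7.0) * 3.0, 0.05),
}
print(f"Estimated RT60: {rt60_sabine(**room, absorption=surfaces):.2f} s")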

Relevance:

30.00%

Publisher:

Abstract:

We present a general Multi-Agent System framework for distributed data mining based on a Peer-to-Peer model. Agent protocols are implemented through message-based asynchronous communication. The framework adopts a dynamic load-balancing policy that is particularly suitable for irregular search algorithms. A modular design allows a separation of the general-purpose system protocols and software components from the specific data mining algorithm. The experimental evaluation was carried out on a parallel frequent subgraph mining algorithm, which showed good scalability performance.
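
The framework's code is not given in the abstract; as a minimal sketch of the two ideas it names, message-based asynchronous agent communication and dynamic load balancing between peers, the Python example below lets idle peers request work from busier ones. All names (Peer, "steal_request", the shared task counter) are hypothetical and not taken from the paper.

# Illustrative sketch only (not the paper's framework): peers exchange work
# through message-based asynchronous communication; an idle peer asks a
# random neighbour for a spare task (dynamic load balancing).
import asyncio
import random


class Peer:
    remaining = 0                                 # tasks left across all peers

    def __init__(self, name, tasks):
        self.name = name
        self.tasks = list(tasks)                  # local work queue
        self.inbox = asyncio.Queue()              # asynchronous message channel
        self.others = []
        Peer.remaining += len(self.tasks)

    async def serve_requests(self):
        # Drain the inbox: answer steal requests, keep any work we were sent.
        while not self.inbox.empty():
            kind, payload = self.inbox.get_nowait()
            if kind == "steal_request":
                spare = self.tasks.pop() if len(self.tasks) > 1 else None
                await payload.inbox.put(("work", spare))
            elif kind == "work" and payload is not None:
                self.tasks.append(payload)

    async def run(self):
        while Peer.remaining > 0:
            await self.serve_requests()
            if self.tasks:
                task = self.tasks.pop()
                await asyncio.sleep(0.001)        # stand-in for mining one unit
                Peer.remaining -= 1
                print(f"{self.name} finished task {task}")
            else:
                # Idle: ask a random peer for work, with a timeout because the
                # donor may already have terminated.
                donor = random.choice(self.others)
                await donor.inbox.put(("steal_request", self))
                try:
                    kind, payload = await asyncio.wait_for(self.inbox.get(), 0.05)
                    if kind == "work" and payload is not None:
                        self.tasks.append(payload)
                    elif kind == "steal_request":
                        await payload.inbox.put(("work", None))   # idle: nothing to give
                except asyncio.TimeoutError:
                    pass


async def main():
    # Deliberately unbalanced workload: peer3 starts with most of the tasks.
    peers = [Peer("peer1", range(2)), Peer("peer2", range(2)), Peer("peer3", range(20))]
    for p in peers:
        p.others = [q for q in peers if q is not p]
    await asyncio.gather(*(p.run() for p in peers))


asyncio.run(main())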

Relevance:

30.00%

Publisher:

Abstract:

The creation of OFDM-based Wireless Personal Area Networks (WPANs) has allowed the development of high bit-rate wireless communication devices suitable for streaming High Definition video between consumer products, as demonstrated in Wireless-USB and Wireless-HDMI. However, these devices need high-frequency clock rates, particularly for the OFDM, FFT and symbol processing sections, resulting in high silicon cost and high electrical power consumption. The high clock rates make hardware prototyping difficult, and verification is therefore very important but costly. Acknowledging that electrical power in wireless consumer devices is more critical than the number of implemented logic gates, this paper presents a Double Data Rate (DDR) architecture for implementation inside an OFDM baseband codec in order to reduce the high-frequency clock rates by a factor of 2. The presented architecture has been implemented and tested for ECMA-368 (the Wireless-USB context), resulting in a maximum clock rate of 264 MHz, instead of the expected 528 MHz, anywhere on the baseband codec die.
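
As a back-of-the-envelope check of the clock-rate claim, the sketch below assumes the 528 Msample/s rate implied by the abstract for ECMA-368 and shows how accepting two samples per clock cycle (the DDR idea) halves the required clock to 264 MHz; it is an arithmetic illustration, not the paper's architecture.

# Sketch of the clock-rate argument in the abstract: if the datapath accepts
# two samples per clock cycle (a Double Data Rate style interface), the clock
# needed to sustain a given sample rate halves. The 528 MS/s figure is the
# rate implied by the abstract for ECMA-368.

SAMPLE_RATE_MSPS = 528            # samples per microsecond the codec must sustain

def required_clock_mhz(sample_rate_msps, samples_per_cycle):
    return sample_rate_msps / samples_per_cycle

print("single data rate:", required_clock_mhz(SAMPLE_RATE_MSPS, 1), "MHz")   # 528.0
print("double data rate:", required_clock_mhz(SAMPLE_RATE_MSPS, 2), "MHz")   # 264.0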

Relevance:

30.00%

Publisher:

Abstract:

The creation of OFDM-based Wireless Personal Area Networks (WPANs) has allowed high bit-rate wireless communication devices suitable for streaming High Definition video between consumer products, as demonstrated in Wireless-USB. However, these devices need high clock rates, particularly for the OFDM sections, resulting in high silicon cost and high electrical power consumption. Acknowledging that electrical power in wireless consumer devices is more critical than the number of implemented logic gates, this paper presents a Double Data Rate (DDR) architecture to reduce the OFDM input and output clock rate by a factor of 2. The architecture has been implemented and tested for Wireless-USB (ECMA-368), resulting in a maximum clock of 264 MHz, instead of 528 MHz, anywhere on the die.

Relevance:

30.00%

Publisher:

Abstract:

Objective: To examine the properties of the Social Communication Questionnaire (SCQ) in a population cohort of children with autism spectrum disorders (ASDs) and in the general population. Method: SCQ data were collected from three samples: the Special Needs and Autism Project (SNAP) cohort of 9- to 10-year-old children with special educational needs with and without ASD, and two similar but separate age groups of children from the general population (n = 411 and n = 247). Diagnostic assessments were completed on a stratified subsample (n = 255) of the special educational needs group. A sample-weighting procedure enabled us to estimate characteristics of the SCQ in the total ASD population. Diagnostic status of cases in the general population samples was extracted from child health records. Results: The SCQ showed strong discrimination between ASD and non-ASD cases (sensitivity 0.88, specificity 0.72) and between autism and non-autism cases (sensitivity 0.90, specificity 0.86). Findings were not affected by child IQ or parental education. In the general population samples, between 4% and 5% of children scored above the ASD cutoff, including 1.5% who scored above the autism cutoff. Although many of these high-scoring children had an ASD diagnosis, almost all (approximately 90%) of them had a diagnosed neurodevelopmental disorder. Conclusions: This study confirms the utility of the SCQ as a first-level screen for ASD in at-risk samples of school-age children.
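
As a reminder of how the reported sensitivity and specificity figures are defined, the sketch below computes them from a 2x2 confusion matrix; the counts are invented so that the outputs match the ASD-cutoff values quoted above and are not data from the SNAP study.

# Illustrative only: how sensitivity and specificity, as reported for the SCQ
# cutoffs, are computed from a 2x2 confusion matrix. The counts below are
# invented for the example and are not data from the study.

def sensitivity_specificity(true_pos, false_neg, true_neg, false_pos):
    sensitivity = true_pos / (true_pos + false_neg)   # cases correctly screened in
    specificity = true_neg / (true_neg + false_pos)   # non-cases correctly screened out
    return sensitivity, specificity

sens, spec = sensitivity_specificity(true_pos=88, false_neg=12, true_neg=72, false_pos=28)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")   # 0.88, 0.72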

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new blind equalisation algorithm for pulse amplitude modulation (PAM) data transmitted through nonminimum phase (NMP) channels. The algorithm is based on a noncausal AR model of communication channels and the second- and fourth-order cumulants of the received data series, where only the diagonal slices of the cumulants are used. The AR parameters are adjusted at each sample by using a successive over-relaxation (SOR) scheme, a variant of the ordinary LMS scheme, but with a faster convergence rate and greater robustness to the selection of the 'step-size' in iterations. Computer simulations are implemented for both linear time-invariant (LTI) and linear time-variant (LTV) NMP channels, and the results show that the proposed algorithm has a fast convergence rate and a potential capability to track LTV NMP channels.
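
The cumulant-based blind algorithm itself is not reproduced here; as a hedged illustration of the per-sample adaptive-update family it builds on, the sketch below runs a plain (non-blind) LMS equaliser over a toy nonminimum-phase channel, so the step-size sensitivity that the SOR variant is said to relax can be seen by varying mu. The channel taps, step sizes and training-sequence setup are all invented for the example.

# Illustrative only: a standard LMS adaptive equaliser (not the paper's blind,
# cumulant-based SOR algorithm) on 4-level PAM through a toy channel whose
# zero outside the unit circle makes it nonminimum phase.
import numpy as np

rng = np.random.default_rng(0)
n, taps = 5000, 7
symbols = rng.choice([-3.0, -1.0, 1.0, 3.0], size=n)        # 4-level PAM source
channel = np.array([0.3, 1.0, -0.4])                        # toy nonminimum-phase channel
received = np.convolve(symbols, channel, mode="full")[:n]
received += 0.01 * rng.standard_normal(n)                   # small additive noise

def lms_equalise(x, desired, taps, mu):
    w = np.zeros(taps)
    errors = []
    for k in range(taps, len(x)):
        window = x[k - taps:k][::-1]                 # most recent samples first
        err = desired[k - taps // 2] - w @ window    # delayed reference symbol
        w += mu * err * window                       # per-sample LMS update
        errors.append(err ** 2)
    return w, np.mean(errors[-500:])                 # steady-state squared error

for mu in (1e-4, 1e-3, 5e-3):
    _, mse = lms_equalise(received, symbols, taps, mu)
    print(f"mu={mu:g}: steady-state MSE ~ {mse:.3f}")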

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on a study of computer-mediated communication (CMC) within the context of a distance MA in TEFL programme which used an e-mail discussion list and then a discussion board. The study focused on the computer/Internet access and skills of the target population and their CMC needs and wants. Data were collected from 63 questionnaires and 6 in-depth interviews with students. Findings indicate that computer use and access to the Internet are widespread within the target population. In addition, most respondents indicated some competence in Internet use. No single factor emerged as an overriding inhibitor of personal use. There was limited use of the CMC tools provided on the course for student–student interaction, mainly attributable to time constraints. However, most respondents said that they would like more CMC interaction with tutors. The main factor that would contribute to greater Internet use was training. The paper concludes with recommendations and suggestions for learner training in this area.

Relevance:

30.00%

Publisher:

Abstract:

A new generation of advanced surveillance systems is being conceived as a collection of multi-sensor components, such as video, audio and mobile robots, interacting in a cooperative manner to enhance situational-awareness capabilities and assist surveillance personnel. The prominent issues that these systems face are: the improvement of existing intelligent video surveillance systems, the inclusion of wireless networks, the use of low-power sensors, the design architecture, the communication between different components, the fusion of data emerging from different types of sensors, the location of personnel (providers and consumers), and the scalability of the system. This paper focuses on the aspects pertaining to real-time distributed architecture and scalability. For example, to meet real-time requirements, these systems need to process data streams in concurrent environments designed with scheduling and synchronisation in mind. The paper proposes a framework for the design of visual surveillance systems based on components derived from the principles of Real Time Networks/Data Oriented Requirements Implementation Scheme (RTN/DORIS). It also proposes the implementation of these components using the well-known middleware technology Common Object Request Broker Architecture (CORBA). Results using this architecture for video surveillance are presented through an implemented prototype.
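
The RTN/DORIS components and CORBA interfaces are not shown in the abstract; purely as an illustration of the kind of synchronised, concurrent stream processing described above, the sketch below uses a bounded queue between camera-producer threads and a fusion consumer. All names and parameters are hypothetical.

# Illustrative only: not the paper's RTN/DORIS components or CORBA interfaces,
# just a minimal producer/consumer sketch of synchronised, concurrent stream
# processing (bounded queue = back-pressure, sentinel = clean shutdown).
import queue
import threading
import time

frame_queue = queue.Queue(maxsize=8)           # bounded: producer blocks if consumer lags
SENTINEL = object()

def camera_producer(camera_id, n_frames):
    for i in range(n_frames):
        frame = f"cam{camera_id}-frame{i}"
        frame_queue.put(frame)                  # blocks when the queue is full
        time.sleep(0.01)                        # stand-in for frame capture
    frame_queue.put(SENTINEL)

def fusion_consumer(n_producers):
    finished = 0
    while finished < n_producers:
        item = frame_queue.get()
        if item is SENTINEL:
            finished += 1
        else:
            print("processed", item)            # stand-in for detection / data fusion
        frame_queue.task_done()

producers = [threading.Thread(target=camera_producer, args=(c, 5)) for c in range(2)]
consumer = threading.Thread(target=fusion_consumer, args=(len(producers),))
for t in producers + [consumer]:
    t.start()
for t in producers + [consumer]:
    t.join()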

Relevance:

30.00%

Publisher:

Abstract:

Improved udder health requires consistent application of appropriate management practices by those involved in managing dairy herds and the milking process. Designing effective communication requires that we understand why dairy herd managers behave in the way they do and also how the means of communication can be used both to inform and to influence. Social sciences, ranging from economics to anthropology, have been used to shed light on the behaviour of those who manage farm animals. Communication science tells us that influencing behaviour is not simply a question of 'getting the message across' but of addressing the complex of factors that influence an individual's behavioural decisions. A review of recent studies in the animal health literature shows that different social science frameworks and methodologies offer complementary insights into livestock managers' behaviour, but that the diversity of conceptual and methodological frameworks presents a challenge for animal health practitioners and policy makers who seek to make sense of the findings, and for researchers looking for helpful starting points. Data from a recent study in England illustrate the potential of 'home-made' conceptual frameworks to help unravel the complexity of farmer behaviour. At the same time, though, the data indicate the difficulties facing those designing communication strategies in a context where farmers believe strongly that they are already doing all they can reasonably be expected to do to minimise animal health risks.

Relevance:

30.00%

Publisher:

Abstract:

Recently major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high performance computing systems, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism: The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed memory systems. This approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution finally focuses on very efficient feature selection: it describes a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
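
The workshop papers themselves are not reproduced here; as an illustration of the shared-memory, thread-level parallelism mentioned above for decision tree construction, the sketch below scores candidate split attributes concurrently with a thread pool. The dataset and split criterion are invented, and in CPython the GIL limits the real speed-up for pure-Python work; only the structure is the point.

# Illustrative only: thread-level parallelism over candidate split attributes
# for decision-tree induction on shared memory. A production system would do
# this in native code; here a thread pool just shows the structure.
from concurrent.futures import ThreadPoolExecutor
import random

random.seed(0)
# Toy dataset: 1000 rows, 20 numeric attributes, binary class label.
X = [[random.random() for _ in range(20)] for _ in range(1000)]
y = [random.randint(0, 1) for _ in range(1000)]

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def score_attribute(attr):
    """Weighted Gini impurity of a median split on one attribute."""
    threshold = sorted(row[attr] for row in X)[len(X) // 2]
    left = [y[i] for i, row in enumerate(X) if row[attr] <= threshold]
    right = [y[i] for i, row in enumerate(X) if row[attr] > threshold]
    return attr, (len(left) * gini(left) + len(right) * gini(right)) / len(X)

with ThreadPoolExecutor(max_workers=4) as pool:
    best_attr, best_score = min(pool.map(score_attribute, range(20)), key=lambda t: t[1])
print(f"best split attribute: {best_attr} (weighted Gini {best_score:.4f})")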

Relevance:

30.00%

Publisher:

Abstract:

The themes of awareness and influence within the innovation diffusion process are addressed. The innovation diffusion process is typically represented as stages, yet awareness and influence are somewhat under-represented in the literature. Awareness and influence are situated within the contextual setting of individual actors but also within broader institutional forces. Understanding how actors become aware of an innovation, and then how their opinion is influenced, is important for creating a more innovation-active UK construction sector. Social network analysis is proposed as one technique for mapping how awareness and influence occur and what they look like as a network. Empirical data are gathered using two modes of enquiry: a pilot study consisting of chartered professionals, and a case study organization as it attempted to diffuse an innovation. The analysis demonstrates significant variations across actors’ awareness and influence networks. It is argued that social network analysis can complement other research methods in order to present a richer picture of how actors become aware of innovations and where they draw their influences regarding adopting innovations. In summarizing the findings, a framework for understanding awareness and influence associated with innovation within the UK construction sector is presented. Finally, with the UK construction sector continually being encouraged to be innovative, understanding and managing an actor’s awareness and influence network will be beneficial. The overarching conclusion thus describes the need not only to build research capacity in this area but also to push the boundaries related to the research methods employed.
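
The study's network data are not included in the abstract; as a minimal, hedged illustration of how social network analysis can map who influences whom, the sketch below builds a small directed influence graph with the networkx library and summarises it with degree measures. The actors and ties are invented.

# Illustrative only: representing an awareness/influence network as a
# directed graph and summarising it; actor names and ties are made up.
import networkx as nx

influence = nx.DiGraph()                      # edge A -> B: "A influences B's opinion"
influence.add_edges_from([
    ("trade press", "architect"),
    ("trade press", "site manager"),
    ("professional body", "architect"),
    ("architect", "site manager"),
    ("peer firm", "site manager"),
])

# Most influential sources (out-degree) and most exposed adopters (in-degree).
print("influence exerted:", dict(influence.out_degree()))
print("influence received:", dict(influence.in_degree()))
print("in-degree centrality:", nx.in_degree_centrality(influence))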