797 results for Data-communication
Abstract:
In the wake of the disclosures surrounding PRISM and other US surveillance programmes, this paper assesses the large-scale surveillance practices of a selection of EU member states: the UK, Sweden, France, Germany and the Netherlands. Given the large-scale nature of these practices, which represent a reconfiguration of traditional intelligence gathering, the paper contends that an analysis of European surveillance programmes cannot be reduced to a question of balancing data protection against national security, but has to be framed in terms of collective freedoms and democracy. It finds that four of the five EU member states selected for in-depth examination are engaging in some form of large-scale interception and surveillance of communication data, and identifies parallels and discrepancies between these programmes and the NSA-run operations. The paper argues that these programmes do not stand outside the realm of EU intervention but can be analysed from an EU law perspective via i) an understanding of national security in a democratic rule-of-law framework where fundamental human rights and judicial oversight constitute key norms; ii) the risks posed to the internal security of the Union as a whole as well as to the privacy of EU citizens as data owners; and iii) the potential spillover into the activities and responsibilities of EU agencies. The paper then presents a set of policy recommendations to the European Parliament.
Abstract:
Thanks to the enormous development of LEDs, communications via visible light are becoming increasingly important. The goal of this thesis is to implement, on low-cost boards (such as the Raspberry Pi), a transmission and reception system based on visible light communication. After a first attempt to port the OpenVLC code, developed by the Spanish research centre IMDEA Networks, to the Raspberry Pi, a new approach was taken: a VLC transmitter was implemented in MATLAB Simulink, together with a first draft of a receiver that exploits the SPI (Serial Peripheral Interface). Initial results show that the system works correctly, albeit at very low data rates. Future work will focus on optimising the system.
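As a rough illustration of what the SPI-based receiver side might look like on a Raspberry Pi, the sketch below polls a photodiode front-end through an ADC and thresholds the samples into bits. The thesis does not specify the wiring or modulation, so the MCP3008 ADC, the SPI bus/device numbers, the on-off keying assumption, and all names and thresholds here are illustrative, not taken from the work itself.

```python
# Hypothetical sketch of an SPI-based VLC receiver poll loop on a Raspberry Pi.
# Assumes a photodiode front-end behind an MCP3008 ADC on SPI bus 0,
# chip-select 0, and a transmitter using simple on-off keying (all assumptions).
import spidev

THRESHOLD = 512  # light/dark decision level for a 10-bit ADC (assumption)

spi = spidev.SpiDev()
spi.open(0, 0)                 # bus 0, device (chip-select) 0
spi.max_speed_hz = 1_000_000   # modest clock; the thesis reports low data rates

def read_adc(channel: int) -> int:
    """Read one 10-bit sample from an MCP3008 channel over SPI."""
    # MCP3008 protocol: start bit, single-ended mode + channel, padding byte.
    reply = spi.xfer2([1, (8 + channel) << 4, 0])
    return ((reply[1] & 3) << 8) | reply[2]

def sample_bits(n: int) -> list[int]:
    """Demodulate n on-off-keyed bits by thresholding raw light samples."""
    return [1 if read_adc(0) > THRESHOLD else 0 for _ in range(n)]

if __name__ == "__main__":
    print(sample_bits(16))
    spi.close()
```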
Abstract:
In this paper, the utilisation of high data-rate channels through multi-threaded sending and receiving is studied. As communication technology evolves, higher speeds are used in more and more applications. Generating traffic at Gbps data rates, however, brings complications, especially when the UDP protocol is used and packet fragmentation must be avoided, for example in high-speed reliable transport protocols built on UDP. In such situations the Ethernet packet size has to respect the standard 1500-byte MTU [1], which is widely used in the Internet. A system may not have enough capacity to send messages at the necessary rate in single-threaded mode. A possible solution is to use more threads, which can be efficient on today's widespread multicore systems. Since a non-constant data flow is to be expected in a real network, a further object of study is automatic adaptation to traffic that changes during runtime. Cases investigated in this paper include adjusting the number of threads to a given speed, and keeping the speed at a given rate when the CPU is heavily loaded by other processes while sending data.
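To make the single-thread versus multi-thread trade-off concrete, here is a minimal sketch of a multi-threaded UDP traffic generator whose datagrams fit a 1500-byte MTU without IP fragmentation. The destination address, payload size, packet counts and thread count are arbitrary assumptions for demonstration, not values from the paper.

```python
# Minimal sketch of a multi-threaded UDP traffic generator. Payloads are sized
# to fit a standard 1500-byte Ethernet MTU: 1500 - 20 (IPv4) - 8 (UDP) = 1472.
# All names and parameters are illustrative assumptions.
import socket
import threading

DEST = ("127.0.0.1", 9000)          # hypothetical sink address
PAYLOAD = b"\x00" * 1472            # largest unfragmented UDP payload over IPv4
PACKETS_PER_THREAD = 100_000

def sender(packets: int) -> None:
    """Send a fixed number of UDP datagrams on a dedicated socket."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(packets):
        sock.sendto(PAYLOAD, DEST)
    sock.close()

if __name__ == "__main__":
    # sendto() releases the interpreter lock during the system call, so
    # several sender threads can overlap kernel work on a multicore system
    # when a single thread cannot sustain the target rate.
    threads = [threading.Thread(target=sender, args=(PACKETS_PER_THREAD,))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Adapting to runtime load, as the paper studies, would then amount to growing or shrinking this thread pool while monitoring the achieved send rate.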
Abstract:
Vol. 2 by Harold D. Becker and John G. Lawton.
Abstract:
Cover title.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
This study explored the impact of downsizing on levels of uncertainty, coworker and management trust, and communicative effectiveness in a health care organization downsizing during a 2-year period from 660 staff to 350 staff members. Self-report data were obtained from employees who were staying (survivors), from employees who were being laid off (victims), and from employees with and without managerial responsibilities. Results indicated that downsizing had a similar impact on the amount of trust that survivors and victims had in management. However, victims reported feeling lower levels of trust toward their colleagues compared with survivors. Contrary to expectations, survivors and victims reported similar perceptions of job and organizational uncertainty and similar levels of information received about changes. Employees with no management responsibilities and middle managers both reported lower scores than did senior managers on all aspects of information received. Implications for practice and the management of the communication process are discussed.
Abstract:
As humans expand into space, communities will form. These have already begun to form in small ways, such as long-duration missions on the International Space Station and the space shuttle, and small-scale tourist excursions into space. Social, behavioural and communications data emerging from such existing communities in space suggest that the physically-bounded, work-oriented and traditionally male-dominated nature of these extremely remote groups presents specific problems for the resident astronauts, groups of them viewed as ‘communities’, and their associated groups who remain on Earth, including mission controllers, management and astronauts’ families. Notionally feminine group attributes such as adaptive competence, social adaptation skills and social sensitivity will be crucial to the viability of space communities, and in the absence of gender equity, ‘staying in touch’ by means of ‘news from home’ becomes more important than ever. A template of news and media forms and technologies is suggested to service those needs and enhance the social viability of future terraforming activities.
Abstract:
Networked information and communication technologies are rapidly advancing the capacities of governments to target and separately manage specific sub-populations, groups and individuals. Targeting uses data profiling to calculate the differential probabilities of outcomes associated with various personal characteristics. This knowledge is used to classify and sort people for differentiated levels of treatment. Targeting is often used to direct government resources efficiently and effectively to the most disadvantaged. Although it has many benefits, targeting raises several policy and ethical issues. This paper discusses these issues and the policy responses governments may take to maximise the benefits of targeting while ameliorating the negative aspects.
Abstract:
This is the third article of a series entitled Astronauts as Audiences. In this article, we investigate the roles that situation awareness (SA), communications, and reality TV (including media communications) might play in the lives of astronauts in remote space communities. We examined primary data about astronauts’ living and working environments and applicable theories of SA, communications, and reality TV (including media communications). We then surmised that the collective application of these roles might be a means of enhancing the lives of astronauts in remote space communities.
Abstract:
Objective: An estimation of cut-off points for the diagnosis of diabetes mellitus (DM) based on individual risk factors. Methods: A subset of the 1991 Oman National Diabetes Survey is used, including all patients with a 2h post glucose load >= 200 mg/dl (278 subjects) and a control group of 286 subjects. All subjects previously diagnosed as diabetic and all subjects with missing data values were excluded. The data set was analyzed by use of the SPSS Clementine data mining system. Decision tree learners (C5 and CART) and a method for mining association rules (the GRI algorithm) are used. Fasting plasma glucose (FPG), age, sex, family history of diabetes and body mass index (BMI) are the input risk factors (independent variables), while diabetes onset (the 2h post glucose load >= 200 mg/dl) is the output (dependent variable). All three techniques used were tested by use of cross-validation (89.8%). Results: Rules produced for diabetes diagnosis are: A- GRI algorithm: (1) FPG >= 108.9 mg/dl, (2) FPG >= 107.1 mg/dl and age > 39.5 years. B- CART decision tree: FPG >= 110.7 mg/dl. C- The C5 decision tree learner: (1) FPG >= 95.5 mg/dl and age > 54 years, (2) FPG >= 106 mg/dl and BMI > 25.2 kg/m2, (3) FPG >= 106 and <= 133 mg/dl. The three techniques produced rules which cover a significant number of cases (82%), with confidence between 74% and 100%. Conclusion: Our approach supports the suggestion that the present cut-off value of fasting plasma glucose (126 mg/dl) for the diagnosis of diabetes mellitus needs revision, and that individual risk factors such as age and BMI should be considered in defining the new cut-off value.
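For illustration, cut-off rules of the CART kind used in this study can be reproduced in miniature with scikit-learn; the printed tree splits play the role of the reported thresholds. The synthetic data, column names, label model and tree depth below are assumptions for demonstration only and do not reproduce the survey's actual rules.

```python
# Illustrative sketch: deriving diagnostic cut-off rules with a CART-style
# decision tree. All data here is synthetic; nothing below is from the study.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500
fpg = rng.normal(110, 25, n)            # fasting plasma glucose, mg/dl
age = rng.integers(20, 80, n)           # years
bmi = rng.normal(27, 5, n)              # kg/m^2
# Crude synthetic stand-in for "2h post-load glucose >= 200 mg/dl":
# a noisy function of FPG, age and BMI (assumption for demo purposes).
y = ((fpg + 0.3 * age + 1.5 * bmi + rng.normal(0, 15, n)) > 160).astype(int)

X = np.column_stack([fpg, age, bmi])
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The split thresholds printed here are the analogue of the paper's rules,
# e.g. "FPG >= 110.7 mg/dl" from the CART branch of the study.
print(export_text(tree, feature_names=["FPG", "age", "BMI"]))
```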
Abstract:
In recent years many real-time applications need to handle data streams. We consider distributed environments in which remote data sources continuously collect data from the real world or from other data sources and push it to a central stream processor. In such environments, significant communication is induced by transmitting rapid, high-volume and time-varying data streams, while computing overhead is also incurred at the central processor. In this paper, we develop a novel filter approach, called DTFilter, for evaluating windowed distinct queries in such a distributed system. DTFilter is based on a search algorithm over a data structure of two height-balanced trees; it avoids transmitting duplicate items in data streams and thus saves considerable network resources. In addition, a theoretical analysis of the time spent performing the search and of the amount of memory needed is provided. Extensive experiments also show that DTFilter achieves high performance.
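A minimal sketch of the idea behind source-side duplicate suppression for windowed distinct queries follows. The paper's DTFilter is built on two height-balanced trees; this sketch substitutes a hash-based multiset over a count-based sliding window, so the class name, window size and interface are illustrative assumptions rather than the paper's design.

```python
# Sketch of a source-side filter for windowed distinct queries: an item is
# transmitted only if it is not already present in the current window.
from collections import deque

class DistinctWindowFilter:
    """Decide, per incoming item, whether it must be sent to the central
    stream processor, suppressing duplicates within the sliding window."""

    def __init__(self, window_size: int) -> None:
        self.window = deque()      # items in arrival order
        self.counts: dict = {}     # item -> multiplicity within the window
        self.window_size = window_size

    def push(self, item) -> bool:
        """Return True if `item` should be transmitted (first occurrence
        within the current window), False if it is a duplicate."""
        if len(self.window) == self.window_size:
            old = self.window.popleft()      # expire the oldest item
            self.counts[old] -= 1
            if self.counts[old] == 0:
                del self.counts[old]          # item left the window entirely
        is_new = item not in self.counts
        self.window.append(item)
        self.counts[item] = self.counts.get(item, 0) + 1
        return is_new

# Example: only 3 of these 6 readings would cross the network.
f = DistinctWindowFilter(window_size=4)
print([f.push(x) for x in [7, 7, 3, 7, 9, 3]])
# [True, False, True, False, True, False]
```

Replacing the multiset with height-balanced trees, as the paper does, additionally supports ordered operations and predictable worst-case bounds, which is what its theoretical analysis quantifies.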
Abstract:
Although managers consider accurate, timely, and relevant information critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein investigated and tracked data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct them and to prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the project. The results indicated that ensuring and maintaining data quality in a mission-critical database requires a commitment to continuous data quality improvement. Communication among all stakeholders is also required to ensure a common understanding of data quality improvement goals. The project found that further substantial improvements to data quality sometimes require structural changes within the organisation and to its information systems. The major goal of the action research study is to increase the level of data quality awareness within all organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.
Abstract:
Large amounts of information can be overwhelming and costly to process, especially when transmitting data over a network. A typical modern Geographical Information System (GIS) brings all types of data together based on their geographic component and provides simple point-and-click query capabilities as well as complex analysis tools. Querying a Geographical Information System, however, can be prohibitively expensive due to the large amounts of data that may need to be processed. Since the use of GIS technology has grown dramatically in the past few years, there is now more need than ever to provide users with the fastest and least expensive query capabilities, especially since an estimated 80% of data stored in corporate databases has a geographical component. However, not every application requires the same high-quality data for its processing. In this paper we address the issues of reducing the cost and response time of GIS queries by pre-aggregating data, at the expense of data accuracy and precision. We present computational issues in the generation of multi-level resolutions of spatial data and show that the problem of finding the best approximation for a given region and a real-valued function on this region, under a predictable error, is in general NP-complete.
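To make the pre-aggregation trade-off concrete, the toy sketch below collapses a fine grid of values into coarser cells and measures the worst-case error of answering queries from the coarse level. The grid sizes, the averaging operator and the error measure are illustrative assumptions, not the paper's construction (whose optimal version is what the NP-completeness result concerns).

```python
# Toy sketch of multi-level pre-aggregation of spatial data: a fine grid is
# collapsed into coarser cells, trading accuracy for cheaper queries.
import numpy as np

def preaggregate(grid: np.ndarray, factor: int) -> np.ndarray:
    """Average factor x factor blocks of a fine grid into one coarse cell."""
    h, w = grid.shape
    blocks = grid[: h - h % factor, : w - w % factor]
    return blocks.reshape(h // factor, factor,
                          w // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(1)
fine = rng.random((64, 64))        # fine-resolution value function (synthetic)
coarse = preaggregate(fine, 8)     # 8x8 blocks -> one 8x8 coarse grid

# Worst-case error introduced by answering queries from the coarse level:
# broadcast each coarse cell back over its block and compare to the original.
approx = np.kron(coarse, np.ones((8, 8)))
print("max abs error:", np.abs(fine - approx).max())
```

A multi-level hierarchy repeats this step at several factors, letting a query planner pick the coarsest resolution whose error bound the application can tolerate.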