19 results for receiver-based geomulticast protocol


Relevance:

40.00%

Publisher:

Abstract:

Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention compared with simple feedback in reducing the proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) computer-generated feedback; or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation will be conducted of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective, or conversely prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
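For illustration only, the sketch below shows how a two-proportion power calculation of this general kind can be set up in Python with statsmodels; the baseline error rate, average cluster size and design effect used here are hypothetical placeholders and are not figures from the trial protocol.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Hypothetical inputs for illustration; the trial's actual baseline rates,
# cluster sizes and intracluster correlation are not given in the abstract.
baseline = 0.05                          # assumed baseline error rate
p_intervention = baseline * (1 - 0.50)   # 50% reduction in pharmacist-led arm
p_control = baseline * (1 - 0.11)        # 11% reduction in feedback arm

effect = abs(proportion_effectsize(p_intervention, p_control))
n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                         power=0.80, alternative="two-sided")

design_effect = 1.5        # assumed inflation factor for cluster randomisation
cluster_size = 100         # assumed number of at-risk patients per practice
practices_per_arm = n_per_arm * design_effect / cluster_size
print(f"individuals per arm: {n_per_arm:.0f}; "
      f"practices per arm (assumed clustering): {practices_per_arm:.1f}")
```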

Relevance:

40.00%

Publisher:

Abstract:

The use of Wireless Sensor Networks (WSNs) in healthcare systems has attracted considerable attention in recent years. In much of this research, tasks such as sensor data processing, health-state decision making and emergency message sending are performed by a remote server. Large numbers of patients generating large volumes of sensor data consume considerable communication resources, place a burden on the remote server and delay decisions and notifications. This paper simulates a WSN-based healthcare application for elderly people. A WSN designed for the proposed healthcare application needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from the patients to the medical centre. Based on these requirements, a cross-layer protocol built on modified versions of APTEEN and GinMAC has been designed and implemented, with new features such as a mobility module and route discovery algorithms added. Simulation results show that the proposed cross-layer protocol conserves node energy and provides the required performance in terms of network lifetime, delay and reliability for the proposed healthcare application.
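The modified APTEEN/GinMAC cross-layer design itself is not detailed in the abstract; purely as a loose illustration of one added feature, the sketch below implements a generic flooding-style route discovery from a sensor node to the sink over an assumed adjacency list (the topology and node names are hypothetical).

```python
from collections import deque

# Generic flooding/BFS route discovery from a source node to the sink.
# Illustrative only: the paper's modified APTEEN + GinMAC cross layer and
# its mobility module are not reproduced here. Topology is hypothetical.
TOPOLOGY = {
    "sink": ["n1", "n2"],
    "n1": ["sink", "n2", "n3"],
    "n2": ["sink", "n1", "n4"],
    "n3": ["n1", "patient_node"],
    "n4": ["n2", "patient_node"],
    "patient_node": ["n3", "n4"],
}

def discover_route(source: str, sink: str) -> list:
    """Breadth-first route discovery; returns the hop sequence source -> sink."""
    parent = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == sink:
            route = []
            while node is not None:
                route.append(node)
                node = parent[node]
            return route[::-1]
        for neighbour in TOPOLOGY.get(node, []):
            if neighbour not in parent:      # first request wins, as in flooding
                parent[neighbour] = node
                queue.append(neighbour)
    return []                                 # no route found

print(discover_route("patient_node", "sink"))  # e.g. ['patient_node', 'n3', 'n1', 'sink']
```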

Relevance:

30.00%

Publisher:

Abstract:

The Password Authentication Protocol (PAP) is widely used in the Wireless Fidelity Point-to-Point Protocol to authenticate the identity and password of a peer. This paper uses a new knowledge-based framework to verify the PAP protocol and a fixed version of it. Flaws are found in both the original and the fixed versions. A new enhanced protocol is provided and its security is proved. The whole process is implemented in a mechanical reasoning platform, Isabelle. It takes only a few seconds to find the flaws in the original and the fixed protocols and to verify that the enhanced version of the PAP protocol is secure.
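As background for readers unfamiliar with PAP, the sketch below mimics its simple request/ack message flow in Python. It is not the paper's Isabelle formalisation or its enhanced protocol; the salted-hash storage on the verifier side is an added assumption, and the point of the sketch is that the peer's credentials travel in the clear.

```python
import hashlib, hmac, os

# Minimal sketch of a PAP-style message flow (RFC 1334 shape), assuming the
# authenticator stores a salted hash of each password. This is NOT the
# paper's enhanced protocol or its formal model; it only illustrates why
# cleartext PAP is easy to attack on an unprotected link.

STORED = {}  # username -> (salt, hash)

def register(user: str, password: str) -> None:
    salt = os.urandom(16)
    STORED[user] = (salt, hashlib.sha256(salt + password.encode()).digest())

def authenticate_request(user: str, password: str) -> tuple:
    # Peer side: PAP transmits the credentials in the clear, so any
    # eavesdropper on the link learns the password.
    return (user, password)

def authenticate(request: tuple) -> bool:
    # Authenticator side: verify against the stored salted hash.
    user, password = request
    if user not in STORED:
        return False
    salt, digest = STORED[user]
    candidate = hashlib.sha256(salt + password.encode()).digest()
    return hmac.compare_digest(candidate, digest)

register("alice", "s3cret")
print(authenticate(authenticate_request("alice", "s3cret")))  # True  (Ack)
print(authenticate(authenticate_request("alice", "wrong")))   # False (Nak)
```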

Relevance:

30.00%

Publisher:

Abstract:

When the orthogonal space-time block code (STBC), or the Alamouti code, is applied to a multiple-input multiple-output (MIMO) communication system, optimum reception can be achieved by simple signal decoupling at the receiver. The performance, however, deteriorates significantly in the presence of co-channel interference (CCI) from other users. In this paper, this CCI problem is overcome by applying independent component analysis (ICA), a blind source separation algorithm. This is based on the fact that, if the data transmitted from the different transmit antennas are mutually independent, they can be effectively separated at the receiver using the principle of blind source separation, and the CCI is thereby suppressed. Although not required by the ICA algorithm itself, a small number of training symbols are necessary to eliminate the phase and order ambiguities at the ICA outputs, leading to a semi-blind approach. Numerical simulations are presented to verify the proposed ICA approach in the multiuser MIMO system.
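A minimal sketch of this idea, assuming two real-valued BPSK streams, a random 2x2 mixing channel and scikit-learn's FastICA as the blind source separation step (the paper's exact STBC receiver structure is not reproduced); the short training prefix resolves the sign and order ambiguities mentioned above.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Illustrative sketch: separate two co-channel BPSK streams mixed by an
# unknown real channel, then use a short training prefix to fix the order
# and sign ambiguities that ICA leaves unresolved.
rng = np.random.default_rng(0)
n_sym, n_train = 1000, 20
s = rng.choice([-1.0, 1.0], size=(n_sym, 2))          # independent user streams
H = rng.normal(size=(2, 2))                            # unknown mixing (channel)
x = s @ H.T + 0.05 * rng.normal(size=(n_sym, 2))       # received mixtures + noise

y = FastICA(n_components=2, whiten="unit-variance", random_state=0).fit_transform(x)

# Resolve permutation and sign with the known training symbols.
corr = (s[:n_train].T @ y[:n_train]) / n_train
order = np.argmax(np.abs(corr), axis=1)
signs = np.sign(corr[np.arange(2), order])
s_hat = np.sign(y[:, order] * signs)

ber = np.mean(s_hat[n_train:] != s[n_train:])
print(f"BER after ICA separation: {ber:.4f}")
```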

Relevance:

30.00%

Publisher:

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for the characterization of the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to the description of the spatial distribution of the pixel intensities in mammographic images. These methods were tested with a set of 57 breast masses and 60 normal breast parenchyma regions (dataset1), and with another set of 19 architectural distortions and 41 normal breast parenchyma regions (dataset2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve (A_z = 0.839 for dataset1 and 0.828 for dataset2) among the five methods for both datasets. Lacunarity analysis showed that the ROIs depicting mass lesions and architectural distortion had higher lacunarity than the ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher A_z values (0.903 and 0.875, respectively) than either feature alone for both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, yielding higher A_z values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, as its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart measure to the fractal dimension for texture feature extraction in mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
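For readers unfamiliar with fractal texture measures, the following is a minimal box-counting sketch of fractal dimension estimation on a binarised ROI. It is illustrative only and does not reproduce the FBM estimator or the gliding-box lacunarity computation evaluated in the paper; the random "ROI" below is a placeholder for a real thresholded image region.

```python
import numpy as np

# Simple box-counting estimate of fractal dimension for a binarised ROI.
# Illustrative sketch only; the paper's preferred estimator is the
# fractional Brownian motion (FBM) model, which is not reproduced here.
def box_counting_dimension(mask: np.ndarray) -> float:
    """mask: 2-D boolean array (e.g. a thresholded mammographic ROI)."""
    size = min(mask.shape)
    scales = [2 ** k for k in range(1, int(np.log2(size)))]
    counts = []
    for s in scales:
        # Count boxes of side s that contain at least one foreground pixel.
        h = mask.shape[0] // s * s
        w = mask.shape[1] // s * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # Slope of log(count) versus log(1/scale) gives the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(1)
roi = rng.random((256, 256)) > 0.5            # placeholder for a real ROI
print(f"estimated fractal dimension: {box_counting_dimension(roi):.2f}")
```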

Relevance:

30.00%

Publisher:

Abstract:

Many kernel classifier construction algorithms adopt classification accuracy as the performance metric in model evaluation. Moreover, equal weighting is often applied to each data sample in parameter estimation. These modeling practices often become problematic if the data sets are imbalanced. We present a kernel classifier construction algorithm using orthogonal forward selection (OFS) in order to optimize the model generalization for imbalanced two-class data sets. This kernel classifier identification algorithm is based on a new regularized orthogonal weighted least squares (ROWLS) estimator and the model selection criterion of maximal leave-one-out area under the curve (LOO-AUC) of the receiver operating characteristic (ROC). It is shown that, owing to the orthogonalization procedure, the LOO-AUC can be calculated via an analytic formula based on the new ROWLS parameter estimator, without actually splitting the estimation data set. The proposed algorithm achieves minimal computational expense via a set of forward recursive updating formulae in searching for model terms with maximal incremental LOO-AUC value. Numerical examples are used to demonstrate the efficacy of the algorithm.
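A rough sketch of the selection loop, assuming a plain logistic model scored by cross-validated AUC in place of the paper's ROWLS estimator and its analytic leave-one-out AUC formula; the dataset, model and stopping tolerance here are all illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Greedy forward selection driven by a validation AUC score. The paper's
# ROWLS estimator and analytic LOO-AUC are not reproduced; cross-validated
# AUC of a logistic model is used purely to show the selection loop.
X, y = make_classification(n_samples=400, n_features=12, n_informative=4,
                           weights=[0.9, 0.1], random_state=0)  # imbalanced

selected, remaining, best_auc = [], list(range(X.shape[1])), 0.0
while remaining:
    scores = {
        f: cross_val_score(LogisticRegression(max_iter=1000),
                           X[:, selected + [f]], y,
                           scoring="roc_auc", cv=5).mean()
        for f in remaining
    }
    f_best, auc = max(scores.items(), key=lambda kv: kv[1])
    if auc <= best_auc + 1e-4:        # stop when AUC no longer improves
        break
    selected.append(f_best)
    remaining.remove(f_best)
    best_auc = auc

print("selected features:", selected, "AUC:", round(best_auc, 3))
```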

Relevance:

30.00%

Publisher:

Abstract:

This paper applies the O3BPSK (orthogonal on-off PSK) signaling scheme to multipath fading CDMA channels, for the purpose of near-far resistant detection in the reverse link. Based on the maximum multipath spreading delay, a minimum duration of the “off” period is suggested, with which the temporally adjacent bits (TABs) from different users are decoupled at the receiver. As a result, a Rake-type one-shot linear decorrelating detector (LDD) is obtained. Since no knowledge of echo amplitudes is needed, blind detection can be realised.
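To illustrate why a decorrelating detector needs no amplitude knowledge, here is a minimal synchronous-CDMA sketch with BPSK and randomly chosen spreading codes; the O3BPSK signalling, the multipath channel and the Rake-type one-shot structure of the paper are not reproduced.

```python
import numpy as np

# Minimal sketch of a linear decorrelating detector (LDD) for a synchronous
# CDMA uplink with known spreading codes. It shows that the decorrelator
# only inverts the code cross-correlation matrix and never uses the
# (unknown) received amplitudes.
rng = np.random.default_rng(2)
K, N, n_bits = 4, 16, 2000                               # users, spreading gain, bits
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # spreading codes (columns)
A = np.diag(rng.uniform(0.2, 2.0, K))                    # unknown received amplitudes
b = rng.choice([-1.0, 1.0], size=(K, n_bits))            # transmitted bits

r = S @ A @ b + 0.1 * rng.normal(size=(N, n_bits))       # received chips
y = S.T @ r                                              # matched-filter bank outputs
R = S.T @ S                                              # code cross-correlation matrix
b_hat = np.sign(np.linalg.solve(R, y))                   # decorrelate, then hard decision

print("BER:", np.mean(b_hat != b))
```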

Relevance:

30.00%

Publisher:

Abstract:

Construction planning plays a fundamental role in construction project management and requires teamwork among planners from a diverse range of disciplines, often in geographically dispersed working situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork due to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution to enable a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team in a genuinely geographically dispersed setting. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning and yields a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.

Relevance:

30.00%

Publisher:

Abstract:

Gossip (or epidemic) protocols have emerged as a communication and computation paradigm for large-scale networked systems. These protocols are based on randomised communication, which provides probabilistic guarantees on convergence speed and accuracy. They also provide robustness, scalability, computational and communication efficiency, and high stability under disruption. This work presents a novel Gossip protocol named the Symmetric Push-Sum Protocol for the computation of global aggregates (e.g., the average) in decentralised and asynchronous systems. The proposed approach combines the simplicity of the push-based approach and the efficiency of the push-pull schemes. Push-pull schemes cannot be directly employed in asynchronous systems as they require synchronous paired communication operations to guarantee their accuracy. Although push schemes guarantee accuracy even with asynchronous communication, they suffer from slower and less stable convergence. The Symmetric Push-Sum Protocol does not require synchronous communication and achieves a convergence speed similar to the push-pull schemes, while keeping the accuracy stability of the push scheme. In the experimental analysis, we focus on computing the global average as an important class of node aggregation problems. The results confirm that the proposed method inherits the advantages of both other schemes and outperforms well-known state-of-the-art protocols for decentralised Gossip-based aggregation.
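As a point of reference, here is a minimal sketch of classic push-sum averaging (in the style of Kempe et al.), where each node halves its (sum, weight) pair and pushes one half to a random peer in every round; the symmetric variant proposed in the paper is not reproduced.

```python
import random

# Classic push-sum gossip averaging as an illustrative baseline; the
# Symmetric Push-Sum Protocol of the paper is not reproduced. Each node
# holds a (sum, weight) pair, pushes half of both to a random peer each
# round, and estimates the global average as sum / weight.
random.seed(0)
values = [random.uniform(0, 100) for _ in range(50)]
state = [[v, 1.0] for v in values]              # (sum, weight) per node
true_avg = sum(values) / len(values)

for _ in range(30):
    inbox = [[0.0, 0.0] for _ in state]
    for i, (s, w) in enumerate(state):
        j = random.randrange(len(state))        # random gossip target
        inbox[i][0] += s / 2; inbox[i][1] += w / 2   # keep half
        inbox[j][0] += s / 2; inbox[j][1] += w / 2   # push half
    state = inbox

estimates = [s / w for s, w in state]
print(f"true average {true_avg:.3f}, "
      f"max node error {max(abs(e - true_avg) for e in estimates):.3e}")
```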

Relevance:

30.00%

Publisher:

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the central limit theorem can be applied, leading to a Gaussian distribution. The authors seek to characterise the multiple access interference as a random walk with a random number of steps, for both random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
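A Monte Carlo sketch of the modelling idea, assuming interference generated as compound Gaussian with a unit-mean gamma texture (so that its envelope is K-distributed) and compared with a Gaussian interferer of equal average power; the paper's series expression for the error probability and the correlation-receiver details are not reproduced.

```python
import numpy as np

# Monte Carlo sketch of BPSK detection when the multiple-access interference
# is modelled as compound Gaussian with a gamma-distributed texture (K-type
# envelope), versus a Gaussian interferer of the same average power. This is
# illustrative only and does not reproduce the paper's closed-form series.
rng = np.random.default_rng(3)
n = 200_000
bits = rng.choice([-1.0, 1.0], size=n)
noise = 0.3 * rng.normal(size=n)                      # thermal Gaussian noise

nu = 1.5                                              # texture shape parameter
tau = rng.gamma(shape=nu, scale=1.0 / nu, size=n)     # unit-mean gamma texture
k_interf = 0.5 * np.sqrt(tau) * rng.normal(size=n)    # compound-Gaussian (K-type) MAI
g_interf = 0.5 * rng.normal(size=n)                   # Gaussian MAI, same power

for label, interf in [("K-distributed MAI", k_interf), ("Gaussian MAI", g_interf)]:
    decisions = np.sign(bits + interf + noise)
    print(f"{label:>18}: P(error) = {np.mean(decisions != bits):.4e}")
```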

Relevance:

30.00%

Publisher:

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking that accompanied the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or a semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but many of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly developed together a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, it is possible for the user, for example, to create a balanced but protein-poor diet, or simply to decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk

Relevance:

30.00%

Publisher:

Abstract:

Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
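To make the reported metrics concrete, the sketch below computes overall accuracy and a one-vs-rest multi-class AUC for a hypothetical set of 354 predictions using scikit-learn; it is not the challenge's official evaluation code, and the labels and class probabilities are randomly generated placeholders.

```python
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

# Sketch of the kind of evaluation used for a three-class (AD / MCI / control)
# challenge: overall accuracy plus a multi-class AUC. Illustrative only;
# the reference labels and algorithm outputs below are random placeholders.
rng = np.random.default_rng(4)
y_true = rng.integers(0, 3, size=354)                   # blinded reference labels
scores = rng.dirichlet(np.ones(3), size=354)            # an algorithm's class probabilities
y_pred = scores.argmax(axis=1)

acc = accuracy_score(y_true, y_pred)
auc = roc_auc_score(y_true, scores, multi_class="ovr")  # one-vs-rest multi-class AUC
print(f"accuracy = {acc:.3f}, multi-class AUC = {auc:.3f}")
```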

Relevance:

30.00%

Publisher:

Abstract:

Wireless video sensor networks have been a hot topic in recent years; monitoring is their central capability, and the services offered by a wireless video sensor network can be classified into three major categories: monitoring, alerting, and information on demand. These features have been applied to a large number of applications related to the environment (agriculture, water, forest and fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area and industrial monitoring. Security applications oriented toward critical infrastructures and disaster relief are very important applications that many countries have identified as critical for the near future. This paper aims to design a cross-layer protocol to provide the required quality of service for security-related applications using wireless video sensor networks. Energy saving, delay and reliability for the delivered data are crucial in the proposed application. Simulation results show that the proposed cross-layer protocol offers good performance in terms of providing the required quality of service for the proposed application.

Relevance:

30.00%

Publisher:

Abstract:

Environment monitoring applications using Wireless Sensor Networks (WSNs) have attracted considerable attention in recent years. In much of this research, tasks such as sensor data processing, decision making on environmental states and events, and emergency message sending are performed by a remote server. A cross-layer protocol is proposed for two different applications in which the reliability of delivered data, delay and network lifetime need to be considered; it has been simulated and the results are presented in this paper. A WSN designed for the proposed applications needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from source nodes to the sink. A cross-layer design based on [1] has been extended and simulated for the proposed applications, with new features such as route discovery algorithms added. Simulation results show that the proposed cross-layer protocol conserves node energy and provides the required performance in terms of network lifetime, delay and reliability.

Relevance:

30.00%

Publisher:

Abstract:

A Universal Serial Bus (USB) Mass Storage Device (MSD), often termed a USB flash drive, is ubiquitously used to store important information in unencrypted binary format. This low-cost consumer device is incredibly popular due to its size, large storage capacity and relatively high transfer speed. However, if the device is lost or stolen, an unauthorized person can easily retrieve all the information. Therefore, it is advantageous in many applications to provide security protection so that only authorized users can access the stored information. In order to provide security protection for a USB MSD, this paper proposes a session key agreement protocol that runs after secure user authentication. The main aim of this protocol is to establish session key negotiation through which all the information retrieved from, stored on and transferred to the USB MSD is encrypted. The proposed protocol is efficient and, unlike other protocols in the literature, does not suffer from forgery attacks or password-guessing attacks. The security of the proposed protocol is examined through a formal analysis, which proves that the information is stored confidentially and is protected with strong resilience to relevant security attacks. The computational cost and communication cost of the proposed scheme are analyzed and compared with related work to show that the proposed scheme offers an improved trade-off between computational cost, communication cost and security.
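The proposed protocol itself is not reproduced here; purely as an illustration of what session key agreement after authentication can look like, the sketch below derives a shared 256-bit key between host and device with an ephemeral X25519 exchange and HKDF from the cryptography library (the "info" label and variable names are hypothetical).

```python
from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Illustrative sketch of negotiating a per-session encryption key between a
# host and a USB device after user authentication, using an ephemeral X25519
# exchange and HKDF. This is NOT the protocol proposed in the paper; it only
# shows the general shape of session key agreement.
host_priv = x25519.X25519PrivateKey.generate()
dev_priv = x25519.X25519PrivateKey.generate()

# Each side sends its ephemeral public key (bound to the earlier login step)
# and derives the same shared secret.
host_shared = host_priv.exchange(dev_priv.public_key())
dev_shared = dev_priv.exchange(host_priv.public_key())
assert host_shared == dev_shared

session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"usb-msd-session").derive(host_shared)
print("256-bit session key established:", session_key.hex()[:16], "...")
```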