543 results for llw (send)


Relevance: 10.00%

Abstract:

In this project we design and implement a centralized hashing table in the snBench sensor-network environment. We discuss the feasibility of this approach and compare and contrast it with a distributed hashing architecture, with particular attention to the conditions under which a centralized architecture makes sense. Numerous computational tasks require persistence of data in a sensor-network environment. To motivate the need for data storage in snBench, we demonstrate a practical application of the technology whereby a video camera monitors a room to detect the presence of a person and sends an alert to the appropriate authorities.
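The centralized design described above can be sketched as a single authoritative key-value store that every sensor node reads and writes. A minimal Python illustration, assuming hypothetical names (`CentralStore`, `put`, `get`) rather than actual snBench APIs:

```python
# Minimal sketch of a centralized hash-table service for sensor data.
# All names (CentralStore, put, get) are illustrative, not snBench APIs.

class CentralStore:
    """Single authoritative key-value store that all sensor nodes query."""

    def __init__(self):
        self._table = {}

    def put(self, key, value):
        # Every write goes to the one central node -- no key partitioning,
        # unlike a distributed hash table.
        self._table[key] = value

    def get(self, key, default=None):
        return self._table.get(key, default)

# Example: a camera node persists a detection event, an alerting task reads it.
store = CentralStore()
store.put("room-101/person-detected", True)
if store.get("room-101/person-detected"):
    alert = "notify authorities: person detected in room-101"
```

The trade-off against a distributed table is that every lookup crosses the network to one node, which simplifies consistency but concentrates load.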

Relevance: 10.00%

Abstract:

In this paper, a prototype of a miniaturized, low-power, bi-directional wireless sensor node for wireless sensor networks (WSN) was designed for monitoring the doors and windows of buildings. The capacitive pressure sensors were developed specifically for this application, where package size and minimization of the sensors' power requirements are the major drivers. The capacitive pressure sensors were fabricated using a 2.4 μm thick, strain-compensated, heavily boron-doped SiGeB diaphragm. To integrate the sensors with the wireless module, the sensor die was wire-bonded onto a TO package using chip-on-board (COB) technology. The telemetric link and its capability to send information over a longer range were significantly improved using a new design and optimization process. The simulation tool employed for this work was the Designer® tool from Ansoft Corporation.
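The sensing principle behind such a device can be illustrated with the generic parallel-plate capacitor model, in which pressure deflects the diaphragm and narrows the electrode gap. The dimensions below are hypothetical, not the fabricated device's values:

```python
# Generic parallel-plate model of a capacitive pressure sensor (illustrative
# physics only; the dimensions below are NOT the fabricated device's values).

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2, gap_m, eps_r=1.0):
    """Parallel-plate capacitance C = eps0 * eps_r * A / d."""
    return EPS0 * eps_r * area_m2 / gap_m

# Pressure deflects the diaphragm, shrinking the gap and raising C.
area = (500e-6) ** 2                  # hypothetical 500 um x 500 um diaphragm
c_rest = capacitance(area, 2.0e-6)    # 2.0 um gap at rest
c_loaded = capacitance(area, 1.5e-6)  # gap reduced under applied pressure
assert c_loaded > c_rest
```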

Relevance: 10.00%

Abstract:

CONCLUSION Radiation dose reduction, while preserving image quality, can easily be implemented with this approach. Furthermore, the availability of a dosimetric data archive provides immediate feedback on the implemented optimization strategies. Background JCI standards and European legislation (EURATOM 59/2013) require the implementation of patient radiation protection programs in diagnostic radiology. The aim of this study is to demonstrate the possibility of reducing patients' radiation exposure without decreasing image quality, through a multidisciplinary team (MT) that analyzes dosimetric data from diagnostic examinations. Evaluation Data from CT examinations performed with two different scanners (Siemens Definition™ and GE LightSpeed Ultra™) between November and December 2013 are considered. The CT scanners are configured to automatically send images to the DoseWatch© software, which stores output parameters (e.g. kVp, mAs, pitch) and exposure data (e.g. CTDIvol, DLP, SSDE). The data are analyzed and discussed by an MT composed of medical physicists and radiologists, to identify protocols that show critical dosimetric values and to suggest possible improvement actions. Furthermore, the large amount of data available makes it possible to monitor the diagnostic protocols currently in use and to identify distinct statistical populations for each of them. Discussion We identified critical average CTDIvol values for head and facial bones examinations (61.8 mGy over 151 scans and 61.6 mGy over 72 scans, respectively), performed with the GE LightSpeed CT™. Statistical analysis allowed us to identify two different populations for the head scan, one of which comprised only 10% of the total number of scans and corresponded to lower exposure values. The MT adopted this protocol as the standard.
Moreover, constant monitoring of the output parameters allowed us to identify unusual values in facial bones exams, due to changes made during maintenance service, which the team promptly suggested correcting. This resulted in substantial dose savings in average CTDIvol values of approximately 15% and 50% for head and facial bones exams, respectively. Diagnostic image quality was deemed suitable for clinical use by the radiologists.
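The kind of screening the team performs can be sketched as a comparison of per-protocol average CTDIvol against local reference levels. The scan counts and averages echo the abstract; the reference values are purely illustrative:

```python
# Hedged sketch of the screening step: average CTDIvol per protocol compared
# against a local reference level. Scan counts and averages echo the abstract;
# the reference levels are illustrative, not actual diagnostic reference levels.

scans = {
    "head":         {"n": 151, "avg_ctdivol_mgy": 61.8},
    "facial_bones": {"n": 72,  "avg_ctdivol_mgy": 61.6},
}
reference_mgy = {"head": 60.0, "facial_bones": 35.0}  # hypothetical references

# Flag protocols whose average dose index exceeds the reference for MT review.
flagged = [protocol for protocol, data in scans.items()
           if data["avg_ctdivol_mgy"] > reference_mgy[protocol]]
```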

Relevance: 10.00%

Abstract:

As announced in the November 2000 issue of MathStats&OR [1], one of the projects supported by the Maths, Stats & OR Network funds is an international survey of research into pedagogic issues in statistics and OR. I am taking the lead on this and report here on the progress made during the first year. A paper giving some background to the project and describing initial thinking on how it might be implemented was presented at the 53rd session of the International Statistical Institute in Seoul, Korea, in August 2001, in a session on the future of statistics education research [2]. It sounded easy. I considered myself something of an expert on surveys, having lectured on the topic for many years and having helped students and others who were doing surveys, particularly with the design of their questionnaires. Surely all I had to do was draft a few questions, send them electronically to colleagues in statistical education who would be only too happy to respond, and summarise their responses? I should have learnt from my experience of advising all those students who thought that doing a survey was easy and to whom I had to explain that their ideas were too ambitious. There are several inter-related stages in survey research, and it is important to think about these before rushing into the collection of data. In the case of the survey in question, this planning stage revealed several challenges. Surveys are usually done for a purpose, so even before planning how to do them, it is advisable to think about the final product and the dissemination of results. This is the route I followed.

Relevance: 10.00%

Abstract:

It is almost a tradition that celluloid (or digital) villains are depicted with characteristics that remind us of the real political enemies of the country producing the film, or even of enemies within that country, according to the particular ideology that sustains the film. The case of Christopher Nolan's The Dark Knight trilogy, analyzed here, is representative of this trend for two reasons. First, because it is marked by the political radicalization conducted by the US government after the attack of September 11, 2001. Second, because it offers a profuse gallery of villains who fall outside the circle of friends drawn by the new doctrine of "either with us or against us" opened by George Bush for the twenty-first century. This gallery ranges from the very terrorists who justify the War on Terror (Ra's al Ghul, the Joker) to the "radical left" (Bane, Talia al Ghul), including liberal politicians (Harvey Dent) and the corrupt who take advantage of the softness of the law to commit crimes with impunity (Dr. Crane, the Scarecrow).

Relevance: 10.00%

Abstract:

Key pre-distribution schemes have been proposed as a means to overcome wireless sensor network constraints such as limited communication and processing power. Two sensor nodes can establish a secure link with some probability, based on the information stored in their memories; however, it is not always possible for two sensor nodes to set up a secure link. In this paper, we propose a new approach that elects trusted common nodes, called "proxies", which reside on an existing secure path linking two sensor nodes. These proxies are used to send the generated key, which is divided into parts (nuggets) according to the number of elected proxies. Our approach has been assessed against previously developed algorithms, and the results show that our algorithm more quickly discovers proxies that are closer to both end nodes, thus producing shorter path lengths. We have also assessed the impact of our algorithm on the average time to establish a secure link when the transmitters and receivers of the sensor nodes are "ON". The results show the superiority of our algorithm in this regard. Overall, the proposed algorithm is well suited to wireless sensor networks.
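The splitting of a key into nuggets can be sketched as follows. Byte-wise chunking is a simplification for illustration and may not match the paper's actual division scheme:

```python
# Illustrative sketch of splitting a link key into "nuggets", one per elected
# proxy, and reassembling it at the receiver. Contiguous byte-chunking is a
# simplification; the paper's actual division scheme may differ.

def split_key(key: bytes, num_proxies: int) -> list:
    """Divide the key into num_proxies contiguous chunks (nuggets)."""
    size = -(-len(key) // num_proxies)  # ceiling division
    return [key[i:i + size] for i in range(0, len(key), size)]

def reassemble(nuggets: list) -> bytes:
    """The receiver concatenates the nuggets relayed by each proxy."""
    return b"".join(nuggets)

key = bytes(range(16))       # hypothetical 128-bit link key
nuggets = split_key(key, 3)  # three elected proxies, one nugget each
assert reassemble(nuggets) == key
```

Each proxy forwards only its own nugget along the existing secure path, so no single intermediate node relays the whole key.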

Relevance: 10.00%

Abstract:

Solid low-level radioactive waste (LLW) is currently being disposed of at a number of facilities in the United Kingdom (UK). The safety of these facilities relies to some extent on the use of engineered barriers, such as a cap, to isolate the waste and protect the environment. Generally, the material used as the barrier layer within such a cap should be of low permeability, and it should retain this property over long timescales (beyond the few decades normally required for facilities containing non-radioactive wastes). The objective of this research is to determine the mineralogy of selected geological deposits from the UK and Ireland, as part of a larger project examining their suitability as a capping material, particularly on LLW sites. Mineral transformations resulting from future climate change may affect the long-term performance of the cap and even the disposal facility. X-ray diffraction (XRD) was carried out on the sand, silt and clay fractions of the London Clay, Belfast Upper Boulder Clay, Irish Glacial Till, Belfast Sleech and Ampthill Clay geological deposits. Minerals were present that could have both positive and negative effects on the long-term performance of the cap. Smectite, which has a high shrink-swell potential, may produce cracks in London Clay, Belfast Upper Boulder Clay and Ampthill Clay capping material during drier, hotter periods as a possible consequence of future climate change, resulting in higher permeability. Ampthill Clay and Belfast Sleech had elevated amounts of organic matter (OM), at 5.93% and 5.88% respectively, which may also contribute to cracking; over time, this OM may decompose and result in increased permeability. Gypsum (CaSO4·2H2O) in the silt and sand fractions of Ampthill Clay may reduce the impact of erosion during wetter periods if it is incorporated into the upper portion of the cap.
There are potential negative effects from the acidity created by the weathering of the pyrite (FeS2) present in the silt and sand fractions of Belfast Sleech and Ampthill Clay, which could impede the growth of the grasses used to stabilize the surface of the capping material if this material is used as part of the vegetative soil layer. Additionally, acidic waters generated by pyrite weathering could negatively affect the lower capping layers and the disposal facility in general. However, the calcium carbonate (CaCO3) present in the silt and sand fractions of these deposits, and the dolomite (CaMg(CO3)2) in Belfast Sleech, may counteract this acidity.

Relevance: 10.00%

Abstract:

This article presents the attitudes of 80 teenagers, growing up in one of the most contested localities in Northern Ireland, to cross-community marriages, i.e. those between Catholics and Protestants. Research suggests that adults in interface areas continue to exhibit ethno-sectarian prejudices despite wider political developments such as the Good Friday Agreement. The teenagers perceived that their families would be largely unsupportive of cross-community unions but felt that their own views were much less prejudiced than those of their parents. However, while the majority of teenagers had no objections in principle to marrying outside their religious group, they outlined a number of practical difficulties which couples in cross-community unions would face. These included deciding where to live, in which religion, if any, to bring children up, and where to send children to school. Most of the teenagers suggested that these potential problems would work against them marrying outside their own religious group. These practical dilemmas provide a more nuanced set of reasons for marrying within one's own community than dilemmas based on traditional prejudices and stereotypes, and suggest that teenagers living in sectarian enclaves are more receptive to cross-community marriages than their parents.

Relevance: 10.00%

Abstract:

We propose a dynamic verification approach for large-scale message-passing programs to locate correctness bugs caused by unforeseen nondeterministic interactions. This approach hinges on an efficient protocol to track the causality between nondeterministic message receive operations and potentially matching send operations. We show that causality-tracking protocols that rely solely on logical clocks fail to capture all nuances of MPI program behavior, including the variety of ways in which nonblocking calls can complete. Our approach rests on formally defining the matches-before relation underlying the MPI standard, and on devising lazy-update logical-clock algorithms that can correctly discover all potential outcomes of nondeterministic receives in practice. The resulting protocol, LLCP, can achieve the same coverage as a vector-clock-based algorithm while maintaining good scalability. LLCP allows us to analyze realistic MPI programs involving a thousand MPI processes, incurring only modest overheads in terms of communication bandwidth, latency, and memory consumption. © 2011 IEEE.
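A plain Lamport-clock scheme, the baseline the paper argues is insufficient on its own, can be sketched in a few lines. This illustrates generic logical-clock causality tracking between sends and receives, not the LLCP protocol itself:

```python
# Minimal Lamport-clock sketch of causality tracking between send and receive
# operations. This shows plain logical clocks -- the baseline mechanism the
# paper argues cannot alone capture MPI's matching semantics -- not LLCP.

class Process:
    def __init__(self, rank):
        self.rank = rank
        self.clock = 0

    def send(self):
        self.clock += 1
        return (self.rank, self.clock)  # message carries the sender's clock

    def recv(self, message):
        sender_rank, sender_clock = message
        # The receive is causally after the matching send, so the local
        # clock must advance past the send's timestamp.
        self.clock = max(self.clock, sender_clock) + 1
        return self.clock

p0, p1 = Process(0), Process(1)
msg = p0.send()   # p0's clock ticks to 1; message is (0, 1)
t = p1.recv(msg)  # p1's clock jumps past the send's timestamp
assert t > msg[1]
```

A nondeterministic (wildcard) receive is exactly the case where several sends could be the match, which is why the matches-before relation must consider all potentially matching sends, not just the one observed.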

Relevance: 10.00%

Abstract:

The world is changing. Advances in telecommunications have meant that the world is shrinking – data can be moved across continents in the time it takes to send an email or access the cloud. Although developments such as these highlight the extent of scientific and technological evolution, in terms of legal liability, questions must be asked as to the capacity of our legal structures to evolve accordingly.

This article looks at how emergency telephone provision and any shift to VoIP systems might fit with existing tort liability and associated duty implications. It does so by analysing the technology through the principles that signpost UK tort law. This article recognises that as an emerging area, the legal liability implications have not yet been discussed in any great detail. The aim of this article therefore is to introduce the area, encourage debate and consider the issues that may become increasingly relevant as these types of technologies become industrial standards.

Relevance: 10.00%

Abstract:

BACKGROUND: Antibiotics are frequently prescribed for older adults who reside in long-term care facilities, and a substantial proportion of antibiotic use in this setting is inappropriate. In particular, antibiotics are often prescribed for asymptomatic bacteriuria, a condition for which randomized trials of antibiotic therapy indicate no benefit and, in fact, harm. This proposal describes a randomized trial of diagnostic and therapeutic algorithms to reduce the use of antibiotics in residents of long-term care facilities. METHODS: In this ongoing study, 22 nursing homes have been randomized either to use of the algorithms (11 nursing homes) or to usual practice (11 nursing homes). The algorithms describe the signs and symptoms for which it would be appropriate to send urine cultures or to prescribe antibiotics. They are introduced through in-service sessions for nursing staff and through one-on-one, case-scenario-based sessions for physicians. The primary outcome of the study is courses of antibiotics per 1000 resident days. Secondary outcomes include urine cultures sent and antibiotic courses for urinary indications. Focus groups and semi-structured interviews with key informants will be used to assess the implementation process and to identify key factors for sustainability.
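The primary outcome can be computed directly; a small sketch, with made-up counts for illustration:

```python
# Hedged sketch of the trial's primary outcome measure: antibiotic courses
# per 1000 resident days. The counts below are invented for illustration.

def courses_per_1000_resident_days(courses, residents, days):
    """Rate of antibiotic courses normalized to 1000 resident days."""
    resident_days = residents * days
    return courses / resident_days * 1000

# Hypothetical facility: 100 residents observed for 90 days, 45 courses.
rate = courses_per_1000_resident_days(courses=45, residents=100, days=90)
# 45 courses over 9000 resident days -> 5.0 per 1000 resident days
assert abs(rate - 5.0) < 1e-9
```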

Relevance: 10.00%

Abstract:

We address the problem of designing distributed algorithms for large scale networks that are robust to Byzantine faults. We consider a message passing, full information model: the adversary is malicious, controls a constant fraction of processors, and can view all messages in a round before sending out its own messages for that round. Furthermore, each bad processor may send an unlimited number of messages. The only constraint on the adversary is that it must choose its corrupt processors at the start, without knowledge of the processors’ private random bits.

A good quorum is a set of O(log n) processors which contains a majority of good processors. In this paper, we give a synchronous algorithm which uses polylogarithmic time and Õ(√n) bits of communication per processor to bring all processors to agreement on a collection of n good quorums, solving Byzantine agreement as well. The collection is balanced, in that no processor is in more than O(log n) quorums. This yields the first solution to Byzantine agreement which is both scalable and load-balanced in the full-information model.

The technique, which goes from a situation where slightly more than a 1/2 fraction of processors are good and agree on a short string with a constant fraction of random bits, to a situation where all good processors agree on n good quorums, can be carried out in a fully asynchronous model as well, providing an approach for extending the Byzantine agreement result to that model.
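The intuition that small random quorums are majority-good with high probability when most processors are good can be checked with a quick simulation. This is a hedged illustration of the concentration argument, not the paper's algorithm:

```python
# Illustrative simulation (not the paper's algorithm): how often a random
# quorum of size O(log n), drawn from a population that is 3/4 good, contains
# a majority of good processors.

import math
import random

random.seed(0)
n = 1024
good_fraction = 0.75
quorum_size = 4 * int(math.log2(n))  # 4 log n = 40 processors per quorum

num_good = int(n * good_fraction)
population = [True] * num_good + [False] * (n - num_good)

def quorum_has_good_majority():
    quorum = random.sample(population, quorum_size)
    return sum(quorum) > quorum_size // 2

trials = 1000
successes = sum(quorum_has_good_majority() for _ in range(trials))
# With 3/4 good processors, almost every sampled quorum is good-majority.
assert successes >= 950
```

The concentration improves exponentially with the constant in front of log n, which is why logarithmic-size quorums suffice with high probability.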

Relevance: 10.00%

Abstract:

We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that the following process continues for up to n rounds, where n is the total number of nodes initially in the network: the adversary deletes an arbitrary node from the network, and the network responds by quickly adding a small number of new edges.

We present a distributed data structure that ensures two key properties. First, the diameter of the network is never more than O(log Delta) times its original diameter, where Delta is the maximum degree of the network initially. We note that for many peer-to-peer systems, Delta is polylogarithmic, so the diameter increase would be a O(loglog n) multiplicative factor. Second, the degree of any node never increases by more than 3 over its original degree. Our data structure is fully distributed, has O(1) latency per round and requires each node to send and receive O(1) messages per round. The data structure requires an initial setup phase that has latency equal to the diameter of the original network, and requires, with high probability, each node v to send O(log n) messages along every edge incident to v. Our approach is orthogonal and complementary to traditional topology-based approaches to defending against attack.
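The flavor of a degree-bounded repair rule can be illustrated with a toy example (not the paper's data structure): when a node is deleted, its orphaned neighbors are reconnected along a line, so each neighbor gains at most two new edges:

```python
# Simplified self-healing sketch (NOT the paper's data structure): when a
# node is deleted, its orphaned neighbors are reconnected in a line, so each
# neighbor gains at most two new edges under this toy repair rule.

def heal(adjacency, deleted):
    """Remove `deleted` and reconnect its neighbors along a line."""
    neighbors = sorted(adjacency.pop(deleted))
    for node in neighbors:
        adjacency[node].discard(deleted)
    # Link consecutive orphaned neighbors to restore connectivity.
    for a, b in zip(neighbors, neighbors[1:]):
        adjacency[a].add(b)
        adjacency[b].add(a)

# Star around node 0: deleting the hub would otherwise disconnect the graph.
adjacency = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
heal(adjacency, deleted=0)
assert adjacency == {1: {2}, 2: {1, 3}, 3: {2}}
```

The paper's actual structure is more careful: it bounds the degree increase by 3 over the *original* degree across all n rounds and keeps the diameter within an O(log Δ) factor, which a naive line repair does not guarantee.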

Relevance: 10.00%

Abstract:

We address the distribution of quantum information among many parties in the presence of noise. In particular, we consider how to optimally send to m receivers the information encoded in an unknown coherent state. On the one hand, a local strategy is considered, consisting of a local cloning process followed by direct transmission. On the other hand, a telecloning protocol based on nonlocal quantum correlations is analysed. Both strategies are optimized to minimize the detrimental effects of losses and thermal noise during propagation. The comparison between the local and the nonlocal protocols shows that telecloning is more effective than local cloning for a wide range of noise parameters. Our results indicate that nonlocal strategies can be more robust against noise than local ones, and are thus suitable candidates for playing a major role in quantum information networks.

Relevance: 10.00%

Abstract:

Schools attempting to engage with the families of all learners, including those from culturally and linguistically diverse backgrounds, recognize the importance of effective oral and written communication. The aim of this study is to determine whether school-generated written communication, created by an urban school district serving a culturally and linguistically diverse population in the northeastern US, adhered to the principles of plain English. This exploratory research examined exemplar pieces of school-generated written communication, using different forms of linguistic analysis to determine whether the communication contained elements recognized to facilitate or impede its comprehensibility. Additionally, a text assessment tool that can help schools analyze the written communication they send to families was developed and refined.
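One example of a plain-English check such a tool might include is a readability score. The sketch below computes the standard Flesch Reading Ease formula with a crude vowel-group syllable heuristic; it is an illustration, not the study's actual tool:

```python
# Hedged sketch of one plain-English check a text assessment tool might run:
# the Flesch Reading Ease score (higher = easier to read). The syllable
# counter is a crude vowel-group heuristic, not the study's actual method.

import re

def count_syllables(word):
    """Approximate syllables as runs of vowels (rough heuristic)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

plain = "We will send a note home. Please read it."
dense = "Correspondence dissemination necessitates comprehensive parental engagement."
assert flesch_reading_ease(plain) > flesch_reading_ease(dense)
```

Short sentences and short words push the score up, which is exactly what plain-English guidance asks of school-to-family communication.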