97 results for security measurement
Abstract:
The JXTA-Overlay project is an effort to use JXTA technology to provide a generic set of functionalities that can be used by developers to deploy P2P applications. Since its design mainly focuses on issues such as scalability and overall performance, it does not take security into account. However, as P2P applications have evolved to fulfill more complex scenarios, security has become a very important aspect to consider when evaluating a P2P framework. This work proposes a security extension specifically suited to JXTA-Overlay's idiosyncrasies, providing an acceptable solution to some of its current shortcomings.
Abstract:
Sensor networks have many applications in monitoring and controlling environmental properties such as sound, acceleration, vibration and temperature. Due to limited resources in computational capability, memory and energy, they are vulnerable to many kinds of attacks. The ZigBee specification, based on the IEEE 802.15.4 standard, defines a set of layers specifically suited to sensor networks. These layers support secure messaging using symmetric cryptography. This paper presents two different ways of grabbing the cryptographic key in ZigBee: a remote attack and a physical attack. It also surveys and categorizes additional attacks that can be performed on ZigBee networks: eavesdropping, spoofing, replay and DoS attacks at different layers. From this analysis, it is shown that vulnerabilities still exist in the security scheme of ZigBee technology.
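ZigBee's secure messaging relies on AES-128 in CCM* mode at the network and application layers. The sketch below is not taken from the paper; it uses the Python `cryptography` package to show how a CCM-protected frame might be produced and verified, with invented key, nonce and frame fields, and illustrates why the key-grabbing attacks matter: whoever holds the symmetric key can both decrypt and forge frames.

```python
# Minimal sketch (not the paper's code): ZigBee-style symmetric protection
# with AES-CCM via the third-party 'cryptography' package.
# Key, nonce and frame contents below are hypothetical placeholders.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

network_key = os.urandom(16)               # 128-bit symmetric key (ZigBee uses AES-128)
aead = AESCCM(network_key, tag_length=4)   # short MIC, as allowed by CCM*

nonce = os.urandom(13)                     # 13-byte nonce (CCM accepts 7..13 bytes)
header = b"frame-control|addresses"        # authenticated but not encrypted
payload = b"sensor reading: 21.5 C"

ciphertext = aead.encrypt(nonce, payload, header)

# Anyone holding the same key (e.g. after remote or physical key extraction)
# can decrypt the traffic and also forge frames that pass the integrity check:
recovered = aead.decrypt(nonce, ciphertext, header)
assert recovered == payload
```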
Abstract:
Nowadays, the maturity of the P2P research field has brought forward new problems related to security. For this reason, security is becoming one of the key issues when evaluating a P2P system, and it is important to provide security mechanisms for P2P systems. The JXTAOverlay project is an effort to use JXTA technology to provide a generic set of functions that can be used by developers to deploy P2P applications. However, although its design focused on issues such as scalability and overall performance, it did not take security into account. This work proposes a security framework specifically adapted to the idiosyncrasies of JXTAOverlay.
Abstract:
JXME is the implementation of the JXTA protocols for mobile devices using J2ME. Two different flavors of JXME have been implemented, each one specific to a particular set of devices according to their capabilities. The main value of JXME is the simplicity with which peer-to-peer (P2P) applications can be created on limited devices. In addition to assessing JXME's functionality, it is also important to determine the default security level it provides. This paper presents a brief analysis of the current state of security in JXME, focusing on the JXME-Proxied version, identifies existing vulnerabilities and proposes further improvements in this field.
Abstract:
The use of open source software continues to grow on a daily basis. Today, enterprise applications contain 40% to 70% open source code, and this fact has legal, development, IT security, risk management and compliance organizations focusing their attention on its use as never before. They increasingly understand that the open source content within an application must be detected. Once uncovered, decisions regarding compliance with intellectual property licensing obligations must be made, and known security vulnerabilities must be remediated. From a risk perspective, it is no longer sufficient to leave either of these open source issues unaddressed.
Abstract:
This project includes an introduction to the concepts of RFID and contactless cards, focusing on the widely used MIFARE Classic chip. The main objective is to show how it works and its vulnerabilities, together with some practical examples based on an analysis of different services that use these cards.
Abstract:
In this work, a LIDAR-based 3D Dynamic Measurement System is presented and evaluated for the geometric characterization of tree crops. Using this measurement system, trees were scanned from two opposing sides to obtain two three-dimensional point clouds. After registration of the point clouds, a simple and easily obtainable parameter is the number of impacts received by the scanned vegetation. The work in this study is based on the hypothesis of the existence of a linear relationship between the number of impacts of the LIDAR sensor laser beam on the vegetation and the tree leaf area. Tests performed under laboratory conditions using an ornamental tree and, subsequently, in a pear tree orchard demonstrate the correct operation of the measurement system presented in this paper. The results from both the laboratory and field tests confirm the initial hypothesis and the 3D Dynamic Measurement System is validated in field operation. This opens the door to new lines of research centred on the geometric characterization of tree crops in the field of agriculture and, more specifically, in precision fruit growing.
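The hypothesised linear relation between the number of laser-beam impacts and the tree leaf area can be checked with an ordinary least-squares fit. The sketch below is illustrative only, uses invented numbers and is not the measurement system's software.

```python
# Illustrative sketch only: fits the hypothesised linear relation between
# LIDAR impact counts and measured leaf area. All numbers are invented.
import numpy as np

impacts = np.array([1200, 2300, 3100, 4150, 5200, 6050], dtype=float)  # beam impacts per tree
leaf_area = np.array([2.1, 4.0, 5.3, 7.2, 8.9, 10.4])                  # m^2, reference measurement

# Ordinary least squares fit: leaf_area ~ a * impacts + b
a, b = np.polyfit(impacts, leaf_area, deg=1)
pred = a * impacts + b

# Coefficient of determination as a simple check of linearity
ss_res = np.sum((leaf_area - pred) ** 2)
ss_tot = np.sum((leaf_area - leaf_area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"slope={a:.5f} m^2/impact, intercept={b:.3f} m^2, R^2={r2:.3f}")
```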
Abstract:
A network of twenty stakes was set up on Johnsons Glacier in order to determine its dynamics. During the austral summers from 1994-95 to 1997-98, we estimated surface velocities, mass balances and ice thickness variations. Horizontal velocity increased downstream from 1 m a⁻¹ near the ice divides to 40 m a⁻¹ near the ice terminus. The accumulation zone showed low accumulation rates (maximum of 0.6 m a⁻¹ (ice)), whereas in the lower part of the glacier, ablation rates were 4.3 m a⁻¹ (ice). Over the 3-year study period, both in the accumulation and ablation zones, we detected a reduction in the ice surface level ranging from 2 to 10 m. From the annual vertical velocities and ice-thinning data, the mass balance was obtained and compared with the mass balance field values, resulting in similar estimates. Flux values were calculated using cross-section data and horizontal velocities, and compared with the results obtained by means of mass balance and ice-thinning data using the continuity equation. The two methods gave similar results.
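The comparison described above rests on the standard depth-integrated continuity equation for glacier ice. The abstract does not give the exact formulation used, so the following is only a sketch of the relations presumably involved, with assumed notation: Q the flux through a cross-section of area A, ū the depth-averaged horizontal velocity, H the ice thickness and ḃ the mass balance rate.

```latex
% Sketch of the standard relations implied by the abstract (notation assumed):
% Q = flux through a cross-section, A = cross-sectional area,
% \bar{u} = depth-averaged horizontal velocity, H = ice thickness, \dot{b} = mass balance rate.
\begin{align}
  Q &= \bar{u}\,A, \\[4pt]
  \frac{\partial H}{\partial t} &= \dot{b} - \nabla\cdot\left(\bar{u}\,H\right).
\end{align}
```

The flux from the first relation (cross-sections and measured velocities) can then be compared with the flux divergence inferred from the measured thinning ∂H/∂t and the mass balance ḃ via the second.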
Abstract:
Purpose: Several well-known managerial accounting performance measurement models rely on causal assumptions. Whilst users of the models express satisfaction and link them with improved organizational performance, academic research on real-world applications shows few reliable statistical associations. This paper provides a discussion of the "problematic" of causality in a performance measurement setting. Design/methodology/approach: This is a conceptual study based on an analysis and synthesis of the literature from managerial accounting, organizational theory, strategic management and social scientific causal modelling. Findings: The analysis indicates that dynamic, complex and uncertain environments may challenge any reliance upon valid causal models. Due to cognitive limitations and judgmental biases, managers may fail to trace a correct cause-and-effect understanding of value creation in their organizations. However, even lacking this validity, causal models can support strategic learning and serve as organizational guides if they are able to mobilize managerial action. Research limitations/implications: Future research should highlight the characteristics necessary for the elaboration of convincing and appealing causal models and the social process of their construction. Practical implications: Managers of organizations using causal models should be clear on the purposes of their particular models and their limitations. In particular, difficulties are observed in specifying detailed cause-and-effect relations and in their potential for communicating and directing attention. Managers should therefore construct their models to suit the particular purpose envisaged. Originality/value: This paper provides an interdisciplinary and holistic view on the issue of causality in managerial accounting models.
Abstract:
We present experiments in which the laterally confined flow of a surfactant film driven by controlled surface tension gradients causes the subtended liquid layer to self-organize into an inner upstream microduct surrounded by the downstream flow. The anomalous interfacial flow profiles and the concomitant backflow are a result of the feedback between two-dimensional and three-dimensional microfluidics realized during flow in open microchannels. Bulk and surface particle image velocimetry data combined with an interfacial hydrodynamics model explain the dependence of the observed phenomena on channel geometry.
Abstract:
Background In an agreement assay, it is of interest to evaluate the degree of agreement between the different methods (devices, instruments or observers) used to measure the same characteristic. We propose in this study a technical simplification for inference about the total deviation index (TDI) estimate to assess agreement between two devices of normally-distributed measurements and describe its utility to evaluate inter- and intra-rater agreement if more than one reading per subject is available for each device. Methods We propose to estimate the TDI by constructing a probability interval of the difference in paired measurements between devices, and thereafter, we derive a tolerance interval (TI) procedure as a natural way to make inferences about probability limit estimates. We also describe how the proposed method can be used to compute bounds of the coverage probability. Results The approach is illustrated in a real case example where the agreement between two instruments, a handle mercury sphygmomanometer device and an OMRON 711 automatic device, is assessed in a sample of 384 subjects where measures of systolic blood pressure were taken twice by each device. A simulation study procedure is implemented to evaluate and compare the accuracy of the approach to two already established methods, showing that the TI approximation produces accurate empirical confidence levels which are reasonably close to the nominal confidence level. Conclusions The method proposed is straightforward since the TDI estimate is derived directly from a probability interval of a normally-distributed variable in its original scale, without further transformations. Thereafter, a natural way of making inferences about this estimate is to derive the appropriate TI. Constructions of TI based on normal populations are implemented in most standard statistical packages, thus making it simpler for any practitioner to implement our proposal to assess agreement.
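For normally distributed paired differences D, the TDI at proportion p is the p-quantile of |D|. The following is only a sketch of a plug-in estimate under that normality assumption, using simulated data; it is not the authors' exact tolerance-interval procedure (an upper confidence bound for the TDI would additionally require a one-sided tolerance factor).

```python
# Hedged sketch (not the paper's exact procedure): plug-in TDI estimate for
# normally distributed paired differences. Data below are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
d = rng.normal(loc=1.5, scale=6.0, size=384)   # paired differences, e.g. SBP device A - device B

p = 0.90                                       # proportion of differences to be covered
mu, sd = d.mean(), d.std(ddof=1)

# If D ~ N(mu, sd^2), then (D/sd)^2 is noncentral chi-square(1, (mu/sd)^2),
# so the p-quantile of |D| (the TDI) is:
tdi_hat = sd * np.sqrt(stats.ncx2.ppf(p, df=1, nc=(mu / sd) ** 2))
print(f"Estimated TDI_{p:.2f} = {tdi_hat:.2f} (same units as the measurements)")
```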
Abstract:
This paper reviews the concept of presence in immersive virtual environments: the sense of being there, signalled by people acting and responding realistically to virtual situations and events. We argue that presence is a unique phenomenon that must be distinguished from the degree of engagement and involvement in the portrayed environment. We argue that there are three necessary conditions for presence: (a) a consistent low-latency sensorimotor loop between sensory data and proprioception; (b) statistical plausibility: images must be statistically plausible in relation to the probability distribution of images over natural scenes, with the level of immersion placing a constraint on this plausibility; and (c) behaviour-response correlations: presence may be enhanced and maintained over time by appropriate correlations between the state and behaviour of participants and responses within the environment, correlations that show appropriate responses to the activity of the participants. We conclude with a discussion of methods for assessing whether presence occurs, and in particular recommend the approach of comparison with ground truth, giving some examples of this.