972 results for Forensics computer science


Relevance: 90.00%

Abstract:

"April 1, 1969."

Relevance: 90.00%

Abstract:

"This work was supported by Contract AT(11-1)-1018 with the U.S. Atomic Energy Commission through September 30, 1970. Current support is under Contract AV(11-1)-2118 with the above agency."

Relevance: 90.00%

Abstract:

Purpose: The aim of this project was to design and evaluate a system that would produce tailored information for stroke patients and their carers, customised according to their informational needs, and facilitate communication between the patient and health professional. Method: A human factors development approach was used to develop a computer system, which dynamically compiles stroke education booklets for patients and carers. Patients and carers are able to select the topics about which they wish to receive information, the amount of information they want, and the font size of the printed booklet. The system is designed so that the health professional interacts with it, thereby providing opportunities for communication between the health professional and patient/carer at a number of points in time. Results: Preliminary evaluation of the system by health professionals, patients and carers was positive. A randomised controlled trial that examines the effect of the system on patient and carer outcomes is underway. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
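
As a rough, hypothetical illustration of the kind of tailoring described (not the authors' actual system), the sketch below assembles a booklet from a small topic library according to chosen topics, detail level and font size; the topic texts and the assemble_booklet helper are invented for illustration.

# Hypothetical sketch of topic-based booklet tailoring; topic library,
# detail levels and helper are invented, not the system described above.
TOPIC_LIBRARY = {
    "what is a stroke": {
        "brief": "A stroke happens when blood supply to part of the brain is cut off.",
        "detailed": "A stroke happens when blood supply to part of the brain is cut off, "
                    "either by a clot (ischaemic stroke) or a bleed (haemorrhagic stroke).",
    },
    "recovery and rehabilitation": {
        "brief": "Recovery varies; rehabilitation usually starts in hospital.",
        "detailed": "Recovery varies from person to person; a rehabilitation plan is "
                    "usually agreed with the stroke team before discharge.",
    },
}

def assemble_booklet(chosen_topics, detail="brief", font_size=14):
    """Return booklet text for the chosen topics at the requested detail level."""
    sections = []
    for topic in chosen_topics:
        sections.append(f"{topic.title()}\n{TOPIC_LIBRARY[topic][detail]}")
    header = f"[print at {font_size}pt]\n"
    return header + "\n\n".join(sections)

print(assemble_booklet(["what is a stroke"], detail="detailed", font_size=16))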

Relevance: 90.00%

Abstract:

We present the prototype tool CADS* for the computer-aided development of an important class of self-* systems, namely systems whose components can be modelled as Markov chains. Given a Markov chain representation of the IT components to be included in a self-* system, CADS* automates or aids (a) the development of the artifacts necessary to build the self-* system; and (b) their integration into a fully operational self-* solution. This is achieved through a combination of formal software development techniques including model transformation, model-driven code generation and dynamic software reconfiguration.
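
As a rough, hypothetical illustration of the kind of input such a tool works from (not the CADS* API), the sketch below models a single IT component as a discrete-time Markov chain and computes how its state distribution evolves; the states, transition probabilities and distribution_after helper are all assumed for illustration.

# Hypothetical component model: a discrete-time Markov chain over the states
# OK, DEGRADED and FAILED, and the state distribution after a number of steps.
import numpy as np

STATES = ["OK", "DEGRADED", "FAILED"]

# Row-stochastic transition matrix: P[i][j] = Pr(next state j | current state i).
P = np.array([
    [0.95, 0.04, 0.01],
    [0.30, 0.60, 0.10],
    [0.00, 0.00, 1.00],   # FAILED is absorbing in this toy model
])

def distribution_after(p0, steps):
    """State distribution after `steps` transitions, starting from p0."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p = p @ P
    return p

p0 = [1.0, 0.0, 0.0]          # component starts healthy
for t in (1, 10, 100):
    print(t, dict(zip(STATES, distribution_after(p0, t).round(3))))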

Relevance: 90.00%

Abstract:

In their article, the authors discuss computer waste. By information technology devices they mean the components of computer configurations, that is, computers (desktop, portable, terminal, etc.), their peripherals (monitor, printer, CD writer, etc.), and the parts and accessories of these (chips, mechanical parts, toner cartridges, etc.). The environmental impact of regular use was examined from only one aspect: during regular use, certain parts and consumables (especially printer toner cartridges) are replaced more frequently than the machine itself and thus become waste. The main focus is the end of the lifetime of computer devices, and from this point of view the category of used personal computers is a key concept.

Relevance: 90.00%

Abstract:

The need to provide computers with the ability to distinguish the affective state of their users is a major requirement for the practical implementation of affective computing concepts. This dissertation proposes the application of signal processing methods on physiological signals to extract from them features that can be processed by learning pattern recognition systems to provide cues about a person's affective state. In particular, combining physiological information sensed from a user's left hand in a non-invasive way with the pupil diameter information from an eye-tracking system may provide a computer with an awareness of its user's affective responses in the course of human-computer interactions. In this study an integrated hardware-software setup was developed to achieve automatic assessment of the affective status of a computer user. A computer-based "Paced Stroop Test" was designed as a stimulus to elicit emotional stress in the subject during the experiment. Four signals: the Galvanic Skin Response (GSR), the Blood Volume Pulse (BVP), the Skin Temperature (ST) and the Pupil Diameter (PD), were monitored and analyzed to differentiate affective states in the user. Several signal processing techniques were applied on the collected signals to extract their most relevant features. These features were analyzed with learning classification systems to accomplish the affective state identification. Three learning algorithms: Naïve Bayes, Decision Tree and Support Vector Machine, were applied to this identification process and their levels of classification accuracy were compared. The results achieved indicate that the physiological signals monitored do, in fact, have a strong correlation with the changes in the emotional states of the experimental subjects. These results also revealed that the inclusion of pupil diameter information significantly improved the performance of the emotion recognition system.
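
A minimal sketch of the kind of classifier comparison described, using scikit-learn stand-ins: the feature matrix X below is random placeholder data, whereas in the study it would hold per-segment features extracted from the GSR, BVP, skin temperature and pupil diameter signals, and the labels would come from the Stroop protocol.

# Hypothetical comparison of the three classifiers on placeholder features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))      # placeholder features (e.g. mean GSR, BVP amplitude, ST slope, mean PD)
y = rng.integers(0, 2, size=120)   # 0 = relaxed segment, 1 = stressed (Stroop) segment

models = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:13s} mean accuracy = {acc.mean():.2f}")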

Relevance: 90.00%

Abstract:

Developing analytical models that can accurately describe behaviors of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet-scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments. However, this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with large-scale network models.

Second, this work deals with reducing memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplications in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real-time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation, but, as real systems are included in the network model, it also increases the confidence level of network simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
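
One of the themes above, exploiting structural duplication to cut memory, can be illustrated with a toy interning scheme (a sketch of the general technique, not the dissertation's actual data structures): structurally identical node configurations are stored once and shared by reference, so thousands of model nodes do not carry private copies. The NodeConfig and ConfigPool names are invented for this sketch.

# Hypothetical flyweight/interning illustration of structural de-duplication.
from dataclasses import dataclass

@dataclass(frozen=True)
class NodeConfig:
    kind: str              # e.g. "router" or "host"
    link_bandwidth: float   # Mbps
    queue_size: int         # packets

class ConfigPool:
    """Interning pool: structurally equal configs map to one shared instance."""
    def __init__(self):
        self._pool = {}

    def intern(self, config: NodeConfig) -> NodeConfig:
        return self._pool.setdefault(config, config)

pool = ConfigPool()
nodes = [pool.intern(NodeConfig("host", 100.0, 64)) for _ in range(10_000)]
print(len(pool._pool), "distinct config objects for", len(nodes), "nodes")
print(nodes[0] is nodes[-1])   # True: every node references the same shared object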

Relevance: 90.00%

Abstract:

The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) inter-operability, for combining simulated and real network entities in experiments. This dissertation tackles these issues in three different dimensions. First, we present SVEET, a system that enables inter-operability between real and simulated hosts. In order to increase the scalability of networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real-time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation where a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating the traffic metadata from real applications in the emulation system to reproduce the realistic traffic conditions. On the other hand, the emulation system benefits from receiving the continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
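
The time-dilation idea behind SVEET can be sketched as follows; this is a simplified illustration under assumed semantics, not SVEET's implementation. With a time-dilation factor (TDF) of k, k seconds of wall-clock time are presented to the real hosts as one second of virtual time, giving the simulator headroom to keep pace with high-capacity traffic. The DilatedClock class and its methods are invented for this sketch.

# Hypothetical sketch of time-dilated synchronization between real and virtual time.
import time

class DilatedClock:
    def __init__(self, tdf: float):
        self.tdf = tdf                      # e.g. tdf = 10 slows virtual time 10x
        self.start_wall = time.monotonic()

    def virtual_now(self) -> float:
        """Virtual seconds elapsed since the experiment started."""
        return (time.monotonic() - self.start_wall) / self.tdf

    def sleep_virtual(self, virtual_seconds: float) -> None:
        """Block for the wall-clock time corresponding to `virtual_seconds`."""
        time.sleep(virtual_seconds * self.tdf)

clock = DilatedClock(tdf=10.0)
clock.sleep_virtual(0.1)                    # 0.1 virtual s == 1 wall-clock s
print(f"virtual time elapsed: {clock.virtual_now():.2f} s")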

Relevance: 90.00%

Abstract:

This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.

The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.

Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.

Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
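
The classical, image-level inverse perspective mapping that the thesis generalises to feature maps can be sketched with OpenCV as follows. This is an illustrative toy example with made-up point correspondences and a synthetic image, not the thesis's feature-space method: a homography maps four ground-plane points to a bird's-eye view, removing the perspective convergence.

# Hypothetical image-level IPM: warp a synthetic frame to a bird's-eye view.
import cv2
import numpy as np

# Synthetic stand-in for a camera frame with two converging ground-plane edges.
image = np.zeros((720, 1280, 3), dtype=np.uint8)
cv2.line(image, (420, 350), (100, 700), (255, 255, 255), 5)
cv2.line(image, (860, 350), (1180, 700), (255, 255, 255), 5)

# Four ground-plane points in the image and their target bird's-eye locations
# (all coordinates are illustrative).
src = np.float32([[420, 350], [860, 350], [1180, 700], [100, 700]])
dst = np.float32([[300, 0], [980, 0], [980, 720], [300, 720]])

H = cv2.getPerspectiveTransform(src, dst)           # 3x3 homography
birds_eye = cv2.warpPerspective(image, H, (1280, 720))
# The two converging edges become (approximately) parallel in birds_eye;
# the thesis applies an analogous warp to HOG-like feature maps instead.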

The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.

Relevance: 90.00%

Abstract:

Ecosystem engineers that increase habitat complexity are keystone species in marine systems, increasing shelter and niche availability, and therefore biodiversity. For example, kelp holdfasts form intricate structures and host the largest number of organisms in kelp ecosystems. However, methods that quantify 3D habitat complexity have only seldom been used in marine habitats, and never in kelp holdfast communities. This study investigated the role of kelp holdfasts (Laminaria hyperborea) in supporting benthic faunal biodiversity. Computer-aided tomography (CT) scanning was used to quantify the three-dimensional geometrical complexity of holdfasts, including volume, surface area and surface fractal dimension (FD). Additionally, the number of haptera, number of haptera per unit of volume, and age of kelps were estimated. These measurements were compared to faunal biodiversity and community structure, using partial least-squares regression and multivariate ordination. Holdfast volume explained most of the variance observed in biodiversity indices; however, all other complexity measures also strongly contributed to the variance observed. Multivariate ordinations further revealed that surface area and haptera per unit of volume accounted for the patterns observed in faunal community structure. Using 3D image analysis, this study makes a strong contribution to elucidating the quantitative mechanisms underlying the observed relationship between biodiversity and habitat complexity. Furthermore, the potential of CT-scanning as an ecological tool is demonstrated, and a methodology for its use in future similar studies is established. Such spatially resolved image analysis could help identify structurally complex areas as biodiversity hotspots, and may support the prioritization of areas for conservation.
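
As a rough illustration of one common way to estimate a fractal dimension from volumetric data, the sketch below runs a box-counting estimate on a toy voxel volume; this is not necessarily the surface-FD procedure used in the study, and the box_count and fractal_dimension helpers are invented for illustration.

# Hypothetical box-counting dimension estimate for a binary voxel volume
# (e.g. a thresholded CT scan standing in for a segmented holdfast).
import numpy as np

def box_count(volume: np.ndarray, box_size: int) -> int:
    """Number of box_size^3 boxes containing at least one occupied voxel."""
    z, y, x = volume.shape
    trimmed = volume[:z - z % box_size, :y - y % box_size, :x - x % box_size]
    blocks = trimmed.reshape(
        trimmed.shape[0] // box_size, box_size,
        trimmed.shape[1] // box_size, box_size,
        trimmed.shape[2] // box_size, box_size,
    )
    return int(blocks.any(axis=(1, 3, 5)).sum())

def fractal_dimension(volume: np.ndarray, box_sizes=(2, 4, 8, 16)) -> float:
    counts = [box_count(volume, s) for s in box_sizes]
    # Slope of log(count) versus log(1/box_size) is the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(1)
volume = rng.random((64, 64, 64)) > 0.7   # toy porous structure
print(f"estimated box-counting dimension: {fractal_dimension(volume):.2f}")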

Relevance: 90.00%

Abstract:

Stealthy attackers move patiently through computer networks, taking days, weeks or months to accomplish their objectives in order to avoid detection. As networks scale up in size and speed, monitoring for such attack attempts is increasingly a challenge. This paper presents an efficient monitoring technique for stealthy attacks. It investigates the feasibility of the proposed method under a number of different test cases and examines how the design of the network affects detection. A methodological way for tracing anonymous stealthy activities to their approximate sources is also presented. Bayesian fusion, along with traffic sampling, is employed as a data reduction method. The proposed method is able to monitor stealthy activities using sampling rates of 10-20% without degrading the quality of detection.
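
A toy sketch of the general idea of combining traffic sampling with a Bayesian update follows; the likelihood values, event model and update_posterior helper are assumed for illustration and are not taken from the paper. Only a fraction of packets is inspected, and each sampled suspicious event nudges a per-source posterior of stealthy activity.

# Hypothetical Bayesian update over a sampled packet stream.
import random

SAMPLING_RATE = 0.15          # inspect roughly 15% of packets
PRIOR = 0.01                  # prior probability that a source is stealthy
P_SUSP_GIVEN_ATTACK = 0.60    # assumed likelihood of a suspicious event from an attacker
P_SUSP_GIVEN_BENIGN = 0.05    # assumed likelihood from a benign source

def update_posterior(posterior: float, suspicious: bool) -> float:
    """One Bayesian update of Pr(stealthy) after a sampled observation."""
    p_attack = P_SUSP_GIVEN_ATTACK if suspicious else 1 - P_SUSP_GIVEN_ATTACK
    p_benign = P_SUSP_GIVEN_BENIGN if suspicious else 1 - P_SUSP_GIVEN_BENIGN
    numerator = p_attack * posterior
    return numerator / (numerator + p_benign * (1 - posterior))

posterior = PRIOR
for packet_is_suspicious in [True, False, True, True] * 50:   # toy packet stream
    if random.random() < SAMPLING_RATE:                       # traffic sampling
        posterior = update_posterior(posterior, packet_is_suspicious)

print(f"posterior probability of stealthy activity: {posterior:.3f}")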

Relevance: 90.00%

Abstract:

The development of new learning models has been of great importance throughout recent years, with a focus on creating advances in the area of deep learning. Deep learning first came to prominence in 2006 and has since become a major area of research in a number of disciplines. This paper delves into the area of deep learning to present its current limitations and to propose a new idea for a fully integrated deep and dynamic probabilistic system. The new model will be applicable to a vast number of areas, initially focusing on applications in medical image analysis, with the overall goal of utilising this approach for prediction purposes in computer-based medical systems.