Abstract:
My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925-1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition, which Deleuze terms representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and by explaining the manifest diversity of the given through such notions as essence, idea, God, or the totality of the world. In contrast, Deleuze states that abstractions such as these do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze calls his philosophy transcendental empiricism, and he seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality. As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and of difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice, and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity with regard to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.
Abstract:
In developing countries, demand for electric energy is growing rapidly, so new generating units need to be added. In deregulated power systems, private generating stations are encouraged to add new generation capacity. The appropriate location for a new generator can be found by running repeated power flows and carrying out system studies such as voltage profile analysis, voltage stability analysis, and loss analysis. In this paper a new methodology is proposed that mainly takes the existing network topology into account. A T-index is introduced, which considers the electrical distances between generator and load nodes. This index is used for ranking significant new generation expansion locations and also indicates the amount of generation that can permissibly be installed at these new locations. The concept facilitates medium- and long-term planning of generation expansion within the available transmission corridors. Studies carried out on a sample 7-bus system, an EHV equivalent 24-bus system, and the IEEE 39-bus system are presented for illustration.
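The abstract does not reproduce the T-index formula. Purely as a hedged illustration of an electrical-distance-based ranking, the sketch below assumes the electrical distance between buses i and j is taken as the magnitude of the Thevenin transfer impedance |Z_ii + Z_jj - 2*Z_ij| obtained from the bus impedance matrix, and ranks hypothetical candidate generator buses by their load-weighted distance to the load buses; the toy network, loads, and weighting are invented for illustration, and the paper's actual T-index may be defined differently.

```python
import numpy as np

# Toy 4-bus network: (bus_i, bus_j, series admittance) for each line.
lines = [(0, 1, 1 / 0.10j), (1, 2, 1 / 0.20j), (1, 3, 1 / 0.25j), (0, 3, 1 / 0.40j)]
n = 4

# Assemble the bus admittance matrix Ybus (line shunts ignored for brevity).
Y = np.zeros((n, n), dtype=complex)
for i, j, y in lines:
    Y[i, i] += y
    Y[j, j] += y
    Y[i, j] -= y
    Y[j, i] -= y

# A small shunt at every bus keeps Ybus non-singular in this toy example.
Y += 1e-3j * np.eye(n)
Z = np.linalg.inv(Y)                        # bus impedance matrix Zbus

def electrical_distance(i, j):
    """Magnitude of the Thevenin (transfer) impedance between buses i and j."""
    return abs(Z[i, i] + Z[j, j] - 2 * Z[i, j])

load_buses = {2: 0.9, 3: 0.6}               # hypothetical loads (p.u.)
candidates = [0, 1]                         # hypothetical new-generation sites

# Rank candidate sites by load-weighted electrical distance (smaller = electrically
# closer to the load centres, hence a stronger candidate in this simplified index).
ranking = sorted(
    candidates,
    key=lambda g: sum(p * electrical_distance(g, l) for l, p in load_buses.items()),
)
print("candidate ranking (best first):", ranking)
```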
Abstract:
The aim of the present study was to investigate the challenges that relate to the implementation of virtual inquiry practices in middle school. The case was a school course in which a group of Finnish students (N = 14) and teachers (N = 7) completed group inquiries through virtual collaboration, using a web-based learning environment. The task was to accomplish a cross-disciplinary inquiry into cultural issues. The students worked mainly at home and took much responsibility for their course achievements. The investigators analysed the pedagogical design of the course and the content of the participants' interaction patterns in the web-based environment, using qualitative content analysis and social network analysis. The findings suggest that the students succeeded in producing distinctive cultural products, and both the students and the teachers adopted novel roles during the inquiry. The web-based learning environment was used more as a coordination tool for organizing the collaborative work than as a forum for epistemic inquiry. The tension between the school curriculum and the inquiry practices was manifest in the participants' discussions of the assessment criteria of the course.
Abstract:
The prevalent virtualization technologies provide QoS support within the software layers of the virtual machine monitor (VMM) or the operating system of the virtual machine (VM). The QoS features are mostly provided as extensions to the existing software used for accessing the I/O device, because of which applications sharing the I/O device experience performance loss due to crosstalk effects or reduced usable bandwidth. In this paper we examine the effects of NIC sharing across VMs on a Xen virtualized server and present an alternate paradigm that improves the shared bandwidth and reduces the crosstalk effect on the VMs. We implement the proposed hardware-software changes in a layered queuing network (LQN) model and use simulation techniques to evaluate the architecture. We find that simple changes in the device architecture and the associated system software lead to application throughput improvements of up to 60%. The architecture also enables finer QoS controls at the device level and increases the scalability of device sharing across multiple virtual machines. We find that the performance improvement derived using the LQN model is comparable to that reported by similar but real implementations.
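The architectural changes themselves are not detailed in the abstract. As a loose illustration of what "finer QoS controls at device level" can look like, the following minimal sketch implements a weighted round-robin arbiter over per-VM transmit queues; the class name, VM names, and weights are hypothetical and not taken from the paper.

```python
from collections import deque

class WeightedNicScheduler:
    """Toy weighted round-robin arbiter over per-VM transmit queues."""

    def __init__(self, weights):
        # weights: dict mapping VM name -> relative share of NIC bandwidth
        self.weights = weights
        self.queues = {vm: deque() for vm in weights}

    def enqueue(self, vm, packet):
        self.queues[vm].append(packet)

    def schedule_round(self):
        """Return the packets transmitted in one arbitration round."""
        sent = []
        for vm, weight in self.weights.items():
            for _ in range(weight):
                if self.queues[vm]:
                    sent.append((vm, self.queues[vm].popleft()))
        return sent

# Hypothetical example: vm1 is granted twice the share of vm2.
sched = WeightedNicScheduler({"vm1": 2, "vm2": 1})
for k in range(4):
    sched.enqueue("vm1", f"p1-{k}")
    sched.enqueue("vm2", f"p2-{k}")
print(sched.schedule_round())   # [('vm1', 'p1-0'), ('vm1', 'p1-1'), ('vm2', 'p2-0')]
```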
Abstract:
The key requirements for enabling real-time remote healthcare services on a mobile platform, in the present-day heterogeneous wireless access network environment, are uninterrupted and continuous access to online patient vital medical data, monitoring of the patient's physical condition through video streaming, and so on. For an application, this continuity has to be sufficiently transparent from both a performance perspective and a Quality of Experience (QoE) perspective. While mobility protocols (MIPv6, HIP, SCTP, DSMIP, PMIP, and SIP) strive to provide both, and do so, the limited availability or non-availability (deployment) of these protocols on provider networks and server-side infrastructure has impeded the adoption of mobility on end-user platforms. Add to this the cumbersome OS configuration procedures required to enable mobility protocol support on end-user devices, and the user's enthusiasm for adding this support is lost. Considering the lack of proper mobility implementations that meet the remote healthcare requirements above, we propose SeaMo+, which comprises a light-weight application-layer framework, termed the Virtual Real-time Multimedia Service (VRMS), for mobile devices to provide uninterrupted real-time access to multimedia information for the mobile user. VRMS is easy to configure, platform independent, and does not require additional network infrastructure, unlike other existing schemes. We illustrate the working of SeaMo+ in two realistic remote patient monitoring application scenarios.
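VRMS internals are not described in the abstract. Purely as an illustration of an application-layer continuity mechanism that needs no network-side mobility support, the sketch below resumes an interrupted HTTP transfer from the last received byte using a Range request (assuming the server honours Range headers); the function, URL, and retry policy are hypothetical and unrelated to SeaMo+'s actual design.

```python
import time
import urllib.request

def fetch_with_resume(url, out_path, retry_delay=2.0, max_retries=5):
    """Download url to out_path, resuming from the last received byte after a
    connection drop (e.g. during a handover between access networks)."""
    received = 0
    for _ in range(max_retries):
        req = urllib.request.Request(url, headers={"Range": f"bytes={received}-"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp, \
                 open(out_path, "ab") as out:
                while True:
                    chunk = resp.read(64 * 1024)
                    if not chunk:
                        return received          # transfer complete
                    out.write(chunk)
                    received += len(chunk)
        except OSError:
            time.sleep(retry_delay)              # wait out the handover, then resume
    raise RuntimeError("download did not complete")

# Hypothetical usage:
# fetch_with_resume("http://example.org/vitals.bin", "vitals.bin")
```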
Abstract:
Rapid diagnostics and virtual imaging of damage in complex structures such as folded plates can help reduce the inspection time for guided-wave-based NDE and integrated SHM. A folded plate or box structure is one of the major structural components for increasing structural strength. Damage in the folded plate, mostly in the form of surface-breaking cracks in the inaccessible zone, is a common problem in aerospace structures. One side of the folded plate is attached (either riveted or bonded) to an adjacent structure and is not accessible for immediate inspection. The sensor-actuator network, in the form of a circular array, is placed on the accessible side of the folded plate. In the present work, a circular array is employed to scan the entire folded-plate-type structure for damage diagnosis and to visualize the wave field over the entire structural panel. The method employs guided waves in a relatively low frequency band of 100-300 kHz. The change in the response signal with respect to a baseline signal is used to construct a quantitative relationship with damage size parameters. Detecting damage in the folded plate using this technique has significant potential for off-line and on-line SHM technologies. By employing this technique, surface-breaking cracks on the inaccessible face of the folded plate are detected without disassembly of the structure in a realistic environment.
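The quantitative damage metric is not specified in the abstract. A common choice, shown here only as an illustrative sketch, is a damage index equal to the energy of the residual between the current and baseline signals normalised by the baseline energy; the tone-burst signals below are synthetic stand-ins for measured guided-wave responses.

```python
import numpy as np

def damage_index(baseline, current):
    """Normalised residual energy between baseline and current guided-wave signals."""
    residual = current - baseline
    return float(np.sum(residual**2) / np.sum(baseline**2))

# Toy example: a 200 kHz tone burst, and a 'damaged' response with a small
# scattered echo arriving later (both entirely synthetic).
fs = 5e6                                    # 5 MHz sampling rate
t = np.arange(0, 200e-6, 1 / fs)
burst = np.sin(2 * np.pi * 200e3 * t) * np.exp(-((t - 20e-6) / 10e-6) ** 2)
echo = 0.1 * np.sin(2 * np.pi * 200e3 * t) * np.exp(-((t - 80e-6) / 10e-6) ** 2)

print(f"DI (undamaged) = {damage_index(burst, burst):.4f}")          # 0.0
print(f"DI (damaged)   = {damage_index(burst, burst + echo):.4f}")   # > 0
```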
Abstract:
Birkenhead Sixth Form College implemented a virtual network to open up remote access to the college network for its students, staff and governors. In particular, for childcare students on work placements, this has meant 24/7 secure access to their work and resources, and the ability to make timely updates to their work evidence logs. The impact is better continuity of learning and a dramatic increase in the hand-in rate for work. For the staff, governors and college as a whole, the benefits of anytime-access to the network are more than were envisaged at the outset; not only is it saving them valuable time and eliminating the need for large print runs, it is expected to bring cost-savings to the College in the long term.
Abstract:
This document briefs network managers on Janet's current position regarding VPNs, the different flavours of VPN available, and the current position of VPN on other networks around the globe. The appendix is a technical supplement that provides background information about VPN-enabling technologies.
Abstract:
The South Carolina Coastal Information Network (SCCIN) emerged as a result of a number of coastal outreach institutions working in partnership to enhance coordination of coastal community outreach efforts in South Carolina. This organized effort, led by the S.C. Sea Grant Consortium and its Extension Program, includes partners from federal and state agencies, regional government agencies, and private organizations seeking to coordinate and/or jointly deliver outreach programs that target coastal community constituents. The Network was officially formed in 2006 with the original intention of fostering intra- and inter-agency communication, coordination, and cooperation. Network partners include the S.C. Sea Grant Consortium, S.C. Department of Health and Environmental Control – Office of Ocean and Coastal Resource Management and Bureau of Water, S.C. Department of Natural Resources – ACE Basin National Estuarine Research Reserve, North Inlet-Winyah Bay National Estuarine Research Reserve, Clemson University Cooperative Extension Service and Carolina Clear, Berkeley-Charleston-Dorchester Council of Governments, Waccamaw Regional Council of Governments, Urban Land Institute of South Carolina, S.C. Department of Archives and History, the National Oceanic and Atmospheric Administration – Coastal Services Center and Hollings Marine Laboratory, Michaux Conservancy, Ashley-Cooper Stormwater Education Consortium, the Coastal Waccamaw Stormwater Education Consortium, the S.C. Chapter of the U.S. Green Building Council, and the Lowcountry Council of Governments.
Abstract:
Grinding is an advanced machining process for the manufacturing of valuable, complex, and accurate parts for high-added-value sectors such as aerospace and wind generation. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and the power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements taken on an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 µm). In the case of surface finish, the absolute error is well below Ra = 1 µm (average value 0.32 µm). The present approach can be easily generalized to other grinding operations.
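The paper uses a Layer-Recurrent neural network calibrated on real spindle-power measurements; as a rough, hedged sketch of the same idea, the example below trains a small recurrent network in PyTorch to map a window of spindle-power samples to wheel wear and surface roughness, using synthetic stand-in data rather than the authors' calibration procedure.

```python
import torch
import torch.nn as nn

class VirtualSensor(nn.Module):
    """Small recurrent network: spindle-power sequence -> (wheel wear, Ra)."""

    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)      # outputs: [wear_um, Ra_um]

    def forward(self, power_seq):              # power_seq: (batch, time, 1)
        out, _ = self.rnn(power_seq)
        return self.head(out[:, -1, :])        # predict from the last time step

# Synthetic stand-in data: in practice the inputs are measured spindle-power
# windows and the targets are calibrated wear / roughness measurements.
torch.manual_seed(0)
x = torch.rand(256, 50, 1)                     # 256 windows of 50 power samples
flat = x.reshape(256, -1)
y = torch.stack([flat.mean(dim=1) * 60.0, flat.std(dim=1) * 3.0], dim=1)

model = VirtualSensor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.4f}")
```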
Abstract:
The study of exchange markets dates back to Léon Walras's general equilibrium theory. Since then the economic market has been studied for its equilibrium properties, the fairness of allocations of private and public goods, and even the psychological incentives of participants. This paper studies the dynamics of an exchange economy built on a network of markets where consumers trade with suppliers to optimize utility. Viewing the market as a decentralized network, we study the system from the usual control-theory point of view, evaluating the system's dynamic performance, stability, and robustness. It is shown that certain consumer demand dynamics can lead to oscillations while others converge to optimal allocations. © 2011 IFAC.
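The paper's market model and control analysis are not given in the abstract. In the spirit of tâtonnement price adjustment, the minimal sketch below shows how the adjustment gain alone can separate convergent from oscillatory behaviour; the excess-demand function and gains are hypothetical and far simpler than a networked exchange economy.

```python
import numpy as np

def excess_demand(p):
    """Toy excess demand for one good: positive when the price is too low."""
    return 1.0 / p - 1.0                 # market clears at p = 1

def simulate(gain, p0=0.2, steps=40):
    """Discrete-time price adjustment p <- p + gain * excess_demand(p)."""
    p = p0
    path = []
    for _ in range(steps):
        p = max(p + gain * excess_demand(p), 1e-6)
        path.append(p)
    return np.array(path)

converging = simulate(gain=0.2)          # settles at the clearing price
oscillating = simulate(gain=2.5)         # overshoots into a persistent oscillation
print("low gain, last prices :", np.round(converging[-3:], 3))
print("high gain, last prices:", np.round(oscillating[-3:], 3))
```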
Abstract:
This paper applies a data coding approach, based on the virtual information source model put forward by the author, to propose an image coding (compression) scheme based on neural networks and SVM. The scheme consists of "the image coding (compression) scheme based on SVM" with "the lossless data compression scheme based on neural networks" embedded within it. The experiments show that the scheme achieves a high compression ratio with only slight degradation, partly resolving the contradiction whereby 'high fidelity' and 'high compression ratio' cannot be unified in an image coding system.
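The scheme's details are not given in the abstract. As a generic illustration of SVM-based lossy image coding, the sketch below fits a support vector regression to a single image block, treating pixel intensity as a function of pixel coordinates so that only the support vectors (and their coefficients) need to be stored; the block size, kernel, and tolerance are hypothetical.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical 16x16 image block: a smooth gradient plus mild texture and noise.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:16, 0:16]
block = 0.5 * xx / 15 + 0.3 * np.sin(yy / 3) + 0.05 * rng.standard_normal((16, 16))

coords = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
values = block.ravel()

# epsilon sets the distortion tolerance: larger epsilon -> fewer support
# vectors to store -> higher compression ratio, lower fidelity.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.02).fit(coords, values)
reconstructed = svr.predict(coords).reshape(16, 16)

ratio = values.size / svr.support_.size     # crude proxy for compression ratio
rmse = np.sqrt(np.mean((reconstructed - block) ** 2))
print(f"support vectors: {svr.support_.size}/{values.size}, "
      f"ratio ~ {ratio:.1f}:1, RMSE = {rmse:.4f}")
```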
Abstract:
First, the data to be compressed are regarded as character strings produced by a virtual information source mapping M. Then the model of the virtual information source M is established using a neural network and an SVM. Last, we construct a lossless data compression (coding) scheme based on the neural network and SVM, using the model, an integer function, and an SVM discriminant. The scheme differs fundamentally from older entropy coding (compression) methods, and it can further compress some data already compressed by older entropy coding.
Abstract:
Emerging configurable infrastructures such as large-scale overlays and grids, distributed testbeds, and sensor networks comprise diverse sets of available computing resources (e.g., CPU and OS capabilities and memory constraints) and network conditions (e.g., link delay, bandwidth, loss rate, and jitter) whose characteristics are both complex and time-varying. At the same time, distributed applications to be deployed on these infrastructures exhibit increasingly complex constraints and requirements on the resources they wish to utilize. Examples include selecting nodes and links to schedule an overlay multicast file transfer across the Grid, or embedding a network experiment with specific resource constraints in a distributed testbed such as PlanetLab. Thus, a common problem facing the efficient deployment of distributed applications on these infrastructures is that of "mapping" application-level requirements onto the network in such a manner that the requirements of the application are realized, assuming that the underlying characteristics of the network are known. We refer to this problem as the network embedding problem. In this paper, we propose a new approach to tackle this combinatorially hard problem. Thanks to a number of heuristics, our approach greatly improves performance and scalability over previously existing techniques. It does so by pruning large portions of the search space without overlooking any valid embedding. We present a construction that allows a compact representation of candidate embeddings, which is maintained by carefully controlling the order in which candidate mappings are inserted and invalid mappings are removed. We present an implementation of our proposed technique, which we call NETEMBED – a service that identifies feasible mappings of a virtual network configuration (the query network) onto an existing real infrastructure or testbed (the hosting network). We present results of extensive performance evaluation experiments of NETEMBED using several combinations of real and synthetic network topologies. Our results show that our NETEMBED service is quite effective in identifying one (or all) possible embeddings for quite sizable queries and hosting networks – much larger than what any of the existing techniques or services are able to handle.
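NETEMBED's algorithms and data structures are not reproduced here. The toy sketch below only illustrates the flavour of the embedding problem and of constraint-based pruning: candidate hosts are first filtered by node requirements, and a backtracking search then maps query links only onto host links with sufficient bandwidth; all graphs, attributes, and capacities are invented and far smaller than what NETEMBED handles.

```python
# Query network: node CPU requirements and link bandwidth requirements.
query_nodes = {"a": 2, "b": 1, "c": 4}
query_links = {("a", "b"): 10, ("b", "c"): 5}

# Hosting network: node CPU capacities and link bandwidth capacities.
host_nodes = {"x": 4, "y": 2, "z": 8, "w": 1}
host_links = {("x", "y"): 20, ("y", "z"): 10, ("x", "z"): 4, ("z", "w"): 6}

def link_ok(u, v, need):
    return host_links.get((u, v), host_links.get((v, u), 0)) >= need

# Pruning step: each query node keeps only hosts that satisfy its CPU need.
candidates = {q: [h for h, cap in host_nodes.items() if cap >= need]
              for q, need in query_nodes.items()}

def embed(assigned, remaining):
    """Backtracking search over the pruned candidate sets."""
    if not remaining:
        return dict(assigned)
    q = remaining[0]
    for h in candidates[q]:
        if h in assigned.values():
            continue                                  # one query node per host
        assigned[q] = h
        if all(link_ok(assigned[u], assigned[v], bw)
               for (u, v), bw in query_links.items()
               if u in assigned and v in assigned):
            result = embed(assigned, remaining[1:])
            if result:
                return result
        del assigned[q]
    return None

print(embed({}, list(query_nodes)))   # e.g. {'a': 'x', 'b': 'y', 'c': 'z'}
```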