968 results for anonimato rete privacy deep web onion routing cookie


Relevance:

20.00%

Publisher:

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of a user's data could be stored on some of his family members' computers, on some of his own computers, and also at some online services which he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and being monitored. All of the systems use cryptography to secure the names used for the content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
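The cryptographically verifiable references mentioned above can be pictured with content addressing: an item is named by a hash of its bytes, so any copy received from a peer can be checked against its name. The sketch below is a minimal illustration under that assumption; the function names and the choice of SHA-256 are hypothetical and are not taken from the Peerscape implementation.

# Minimal sketch of a cryptographically verifiable data reference.
# The names and the use of SHA-256 are illustrative assumptions only.
import hashlib

def make_reference(data: bytes) -> str:
    """Derive a self-verifying identifier from the item's content."""
    return hashlib.sha256(data).hexdigest()

def verify_reference(data: bytes, reference: str) -> bool:
    """Check that a copy received from any peer matches its reference."""
    return hashlib.sha256(data).hexdigest() == reference

photo = b"...bytes of a family photo..."
ref = make_reference(photo)
assert verify_reference(photo, ref)             # authentic copy accepted
assert not verify_reference(photo + b"x", ref)  # tampered copy rejected

With such references, every replica of the album can be checked independently of which family member's machine or online service it came from.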

Relevance:

20.00%

Publisher:

Abstract:

Magnetometer data, acquired on spacecraft and simultaneously at high and low latitudes on the ground, are compared in order to study the propagation characteristics of hydromagnetic energy deep into the magnetosphere. Single events provide evidence that wave energy at L ∼ 3 can at times be only one order of magnitude lower than at L ∼ 13. In addition, statistical analyses of the H-component ground-based data obtained during local daytime hours of 17 July-3 August 1985 show that wave amplitudes at L ∼ 3 are generally 10-30 times lower than at L ∼ 13. The L-dependence of near-equator magnetic field fluctuations measured on ISEE-2 shows a sharp drop in energy near the magnetopause and a more gradual fall-off of energy deeper inside the magnetosphere. Such high levels of wave power deep in the magnetosphere have not been quantitatively understood previously. Our initial attempt is to calculate the decay length of an evanescent wave generated at a thick magnetopause boundary. Numerical calculations show that fast magnetosonic modes (called the magnetopause mode and the inner mode) can be generated under very restrictive conditions on the field and plasma parameters. These fast compressional modes may have their energy reduced by only one order of magnitude over a penetration depth of about 8 RE. More realistic numerical simulations need to be carried out to see whether better agreement with the data can be attained.
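As a rough check of what these numbers imply (a back-of-envelope illustration, not a calculation from the paper): if the field amplitude of an evanescent mode decays as exp(-x/δ), its energy decays as exp(-2x/δ), so a one-order-of-magnitude energy reduction over a penetration distance of x ≈ 8 RE corresponds to

  e^{-2x/\delta} = 10^{-1} \quad\Longrightarrow\quad \delta = \frac{2x}{\ln 10} \approx \frac{2 \times 8\,R_E}{2.30} \approx 7\,R_E .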

Relevance:

20.00%

Publisher:

Abstract:

Encoding protein 3D structures into 1D strings using short structural prototypes, or structural alphabets, opens a new front for structure comparison and analysis. Using the well-documented 16 motifs of Protein Blocks (PBs) as a structural alphabet, we have developed a methodology to compare protein structures that are encoded as sequences of PBs by aligning them with dynamic programming using a substitution matrix for PBs. This methodology is implemented in the applications available on the Protein Block Expert (PBE) server. PBE addresses common issues in the field of protein structure analysis, such as comparison of protein structures and identification of protein structures in structural databanks that resemble a given structure. PBE-T provides a facility to transform any PDB file into a sequence of PBs. PBE-ALIGNc performs comparison of two protein structures based on the alignment of their corresponding PB sequences. PBE-ALIGNm is a facility for mining the SCOP database for similar structures based on the alignment of PBs. In addition, PBE provides an interface to a database (PBE-SAdb) of preprocessed PB sequences from SCOP culled at 95% and of all-against-all pairwise PB alignments at the family and superfamily levels. The PBE server is freely available at http://bioinformatics.univ-reunion.fr/PBE/.
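The PB alignment step can be pictured as a standard global dynamic-programming alignment over the 16-letter PB alphabet. The sketch below is only an illustration: the uniform match/mismatch scores and gap penalty stand in for the dedicated PB substitution matrix used by PBE-ALIGNc.

# Toy sketch of aligning two Protein Block (PB) sequences with dynamic
# programming. Scores are placeholders, not the real PB substitution matrix.

PB_ALPHABET = "abcdefghijklmnop"  # the 16 Protein Blocks

def score(x: str, y: str) -> int:
    return 2 if x == y else -1    # placeholder substitution score

def align_score(s1: str, s2: str, gap: int = -2) -> int:
    """Needleman-Wunsch global alignment score of two PB sequences."""
    n, m = len(s1), len(s2)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(dp[i - 1][j - 1] + score(s1[i - 1], s2[j - 1]),
                           dp[i - 1][j] + gap,
                           dp[i][j - 1] + gap)
    return dp[n][m]

print(align_score("mmmnopacd", "mmmnopghi"))  # higher score = more similar backbones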

Relevance:

20.00%

Publisher:

Abstract:

Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate detection of remotely related proteins. We have recently demonstrated the use of Cascade PSI-BLAST, in which we perform PSI-BLAST for many 'generations', initiating searches from new homologues as well. Such a rigorous propagation through generations of PSI-BLAST effectively exploits the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds, and its performance in detecting superfamily-level relationships is ~35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
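The cascading idea can be sketched as a breadth-first propagation of searches, re-initiated from every new homologue found in the previous generation. The sketch below is schematic only; run_psiblast() is a hypothetical placeholder, not a real API, and must be replaced by an actual PSI-BLAST invocation against the chosen database.

# Schematic sketch of a cascaded (multi-generation) homologue search.
def run_psiblast(query_id: str, database: str) -> set[str]:
    """Placeholder: return hit identifiers for one PSI-BLAST run."""
    return set()  # replace with a real PSI-BLAST invocation

def cascade_search(query_id: str, database: str, generations: int = 3) -> set[str]:
    found = set()
    frontier = {query_id}
    for _ in range(generations):
        new_hits = set()
        for seq_id in frontier:
            new_hits |= run_psiblast(seq_id, database)
        frontier = new_hits - found - {query_id}  # only genuinely new homologues seed the next generation
        if not frontier:
            break
        found |= frontier
    return found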

Relevance:

20.00%

Publisher:

Abstract:

The unique characteristics of marketspace, in combination with the fast-growing number of consumers interested in e-commerce, have created new research areas of interest to both marketing and consumer behaviour researchers. Consumer behaviour researchers interested in the decision-making processes of consumers have two new sets of questions to answer. The first set of questions is related to how useful theories developed for the marketplace are in a marketspace context. Cyber auctions, Internet communities and the possibilities for consumers to establish dialogues not only with companies but also with other consumers make marketspace unique. The effects of these distinctive characteristics on the behaviour of consumers have not been systematically analysed and therefore constitute the second set of questions which have to be studied. Most companies feel that they have to be online even though the effects of being on the Net are not unambiguously positive. The relevance of the relationship marketing paradigm in a marketspace context also has to be studied. The relationship enhancement effects of websites from the customers' point of view are therefore emphasized in this research paper. Representatives of the Net generation were analysed, and the results show that companies should develop marketspace strategies, as Net presence has a value-added effect on consumers. The results also indicate that the decision-making processes of consumers are changing as a result of the progress of marketspace.

Relevance:

20.00%

Publisher:

Abstract:

Physical properties provide valuable information about the nature and behavior of rocks and minerals. Changes in rock physical properties generate petrophysical contrasts between various lithologies, for example between shocked and unshocked rocks in meteorite impact structures or between various lithologies in the crust. These contrasts may cause distinct geophysical anomalies, which are often diagnostic of their primary cause (impact, tectonism, etc.). This information is vital to understand fundamental Earth processes, such as impact cratering and associated crustal deformation. However, most of the present-day knowledge of changes in rock physical properties is limited by a lack of petrophysical data on subsurface samples, especially for meteorite impact structures, since they are often buried under post-impact lithologies or eroded. In order to explore the uppermost crust, deep drillings are required. This dissertation is based on deep drill core data from three impact structures: (i) the Bosumtwi impact structure (diameter 10.5 km, age 1.07 Ma; Ghana), (ii) the Chesapeake Bay impact structure (85 km, 35 Ma; Virginia, U.S.A.), and (iii) the Chicxulub impact structure (180 km, 65 Ma; Mexico). These drill cores have yielded all basic lithologies associated with impact craters, such as post-impact lithologies, impact rocks including suevites and breccias, as well as fractured and unfractured target rocks. The fourth case study of this dissertation deals with data from the Paleoproterozoic Outokumpu area (Finland), a non-impact crustal case, where a deep drilling through an economically important ophiolite complex was carried out. The focus in all four cases was to combine results of basic petrophysical studies of the relevant rocks of these crustal structures in order to identify and characterize various lithologies by their physical properties and, in this way, to provide new input data for geophysical modelling. Furthermore, the rock magnetic and paleomagnetic properties of the three impact structures, combined with basic petrophysics, were used to gain insight into the impact-generated changes in rocks and their magnetic minerals, in order to better understand the influence of impact. The obtained petrophysical data outline the various lithologies and divide the rocks into four domains. Depending on the target lithology, the physical properties of the unshocked target rocks are controlled by mineral composition or fabric, particularly porosity in sedimentary rocks, while the properties of the sediments result from diverse sedimentation and diagenesis processes. The impact rocks, such as breccias and suevites, strongly reflect the impact formation mechanism and are distinguishable from the other lithologies by their density, porosity and magnetic properties. The numerous shock features resulting from melting, brecciation and fracturing of the target rocks can be seen in the changes of physical properties. These features include an increase in porosity and a corresponding decrease in density in impact-derived units, either an increase or a decrease in magnetic properties (depending on the specific case), as well as large heterogeneity in physical properties. In a few cases a slight gradual downward decrease in porosity, attributed to shock-induced fracturing, was observed. Coupled with rock magnetic studies, the impact-generated changes in the magnetic fraction can be seen: shock-induced magnetic grain-size reduction, hydrothermal- or melting-related magnetic mineral alteration, shock demagnetization, and shock- or temperature-related remagnetization. The Outokumpu drill core shows varying velocities throughout the core depending on microcracking and sample conditions. This is similar to the observations of Kern et al. (2009), who also reported the dependence of velocity on anisotropy. The physical properties are also used to explain the distinct crustal reflectors observed in seismic reflection studies in the Outokumpu area. According to the seismic velocity data, the interfaces between the diopside-tremolite skarn layer and either serpentinite, mica schist or black schist cause the strong seismic reflectivity.
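The link between the reported porosity increase and density decrease can be summarized by the standard volume-average relation (added here as a reminder; it is not quoted from the dissertation):

  \rho_{\mathrm{bulk}} = (1-\phi)\,\rho_{\mathrm{grain}} + \phi\,\rho_{\mathrm{fluid}} ,

so any shock- or brecciation-induced increase in porosity \phi directly lowers the bulk density measured on the core samples.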

Relevance:

20.00%

Publisher:

Abstract:

We study wireless multihop energy harvesting sensor networks employed for random field estimation. The sensors sense the random field and generate data that is to be sent to a fusion node for estimation. Each sensor has an energy harvesting source and can operate in two modes: Wake and Sleep. We consider the problem of obtaining jointly optimal power control, routing and scheduling policies that ensure a fair utilization of network resources. This problem has a high computational complexity. Therefore, we develop a computationally efficient suboptimal approach to obtain good solutions to this problem. We study the optimal solution and performance of the suboptimal approach through some numerical examples.
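A generic way to picture the fairness objective described above is a max-min fair allocation of sensor rates subject to flow conservation and an average-energy constraint at each harvesting node. This is a stylized sketch of such a formulation, not the authors' exact model:

  \max_{f,P}\ \min_i r_i
  \quad\text{s.t.}\quad
  r_i + \sum_j f_{ji} = \sum_j f_{ij} \ \ \text{(flow conservation at sensor } i\text{)},
  \qquad
  \bar{P}_i(f,P) \le \bar{H}_i \ \ \text{(average consumed power} \le \text{average harvested power)} ,

where r_i is the rate of sensor i and f_{ij} the traffic routed from node i to node j. Jointly choosing the powers, routes and sleep/wake schedules that satisfy these constraints is what makes the exact problem computationally hard.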

Relevance:

20.00%

Publisher:

Abstract:

This dissertation develops a strategic management accounting perspective on inventory routing. The thesis studies the drivers of cost efficiency gains by identifying the role of the underlying cost structure, demand, information sharing, forecasting accuracy, service levels, vehicle fleet, planning horizon and other strategic factors, as well as the interaction effects among these factors with respect to performance outcomes. The task is to enhance knowledge of the strategic situations that favor the implementation of inventory routing systems, to understand cause-and-effect relationships and linkages, and to gain a holistic view of the value proposition of inventory routing. The thesis applies an exploratory case study design based on normative quantitative empirical research using optimization, simulation and factor analysis. Data and results are drawn from a real-world application to cash supply chains. The first research paper shows that performance gains require a common cost component and cannot be explained by simple linear or affine cost structures. Inventory management and distribution decisions become separable in the absence of a set-dependent cost structure, and neither economies of scope nor coordination problems are present in this case. The second research paper analyzes whether information sharing improves the overall forecasting accuracy. The analysis suggests that the potential for information sharing is limited to the coordination of replenishments and that central information does not yield more accurate forecasts based on joint forecasting. The third research paper develops a novel formulation of the stochastic inventory routing model that accounts for minimal service levels and forecasting accuracy. The developed model allows studying the interaction of minimal service levels and forecasting accuracy with the underlying cost structure in inventory routing. Interestingly, the results show that the factors minimal service level and forecasting accuracy are not statistically significant, and consequently not relevant for the strategic decision problem of introducing inventory routing, or in other words, of effectively internalizing inventory management and distribution decisions at the supplier. Consequently, the main contribution of this thesis is the result that the cost benefits of inventory routing are derived from the joint decision model that accounts for the underlying set-dependent cost structure rather than from the level of information sharing. This result suggests that the value of information sharing of demand and inventory data is likely to be overstated in the prior literature. In other words, the cost benefits of inventory routing are primarily determined by the cost structure (i.e. the level of fixed costs and transportation costs) rather than by the level of information sharing, joint forecasting, forecasting accuracy or service levels.
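The separability argument can be illustrated with a stylized cost model (an illustration only, not the dissertation's formulation). If the cost of a delivery route serving a customer set S is purely additive, the joint problem decomposes into independent per-customer problems; a set-dependent (shared fixed) cost couples them and creates the economies of scope that inventory routing exploits:

  c_{\text{additive}}(S) = \sum_{i \in S} c_i
  \qquad\text{vs.}\qquad
  c_{\text{set-dependent}}(S) = F \cdot \mathbf{1}\{S \neq \emptyset\} + \sum_{i \in S} c_i .

With the additive form, each customer's replenishment timing can be optimized in isolation; with a shared fixed cost F, consolidating several customers' replenishments on one route lowers total cost, so inventory and distribution decisions must be taken jointly.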

Relevance:

20.00%

Publisher:

Abstract:

The effects of the positions and widths of the two sampling gates and of the integrator response time on the position, height, and shape of the peaks obtained in a double-channel gated-integrator-based deep-level transient spectroscopy (DLTS) system are evaluated. The best compromise between the sensitivity and the resolution of the DLTS system is shown to be obtained when the ratio of the two sampling gate positions is about 20. An integrator response time of about 100 ms is shown to be suitable for practical values of emission time constants and heating rates generally used.
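For context, the standard two-gate (double-boxcar) rate-window relation, not a result specific to this paper: for an exponential capacitance transient with time constant \tau, the DLTS signal and the time constant at which it peaks are

  S(\tau) \propto e^{-t_1/\tau} - e^{-t_2/\tau},
  \qquad
  \tau_{\max} = \frac{t_2 - t_1}{\ln(t_2/t_1)} ,

so the choice of the gate ratio t_2/t_1 sets the trade-off between peak height (sensitivity) and peak width (resolution) discussed above.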

Relevance:

20.00%

Publisher:

Abstract:

The barrier height of MIS tunnel diodes is studied considering the effect of deep impurities. It is shown that the barrier height of a given MIS system can be controlled by changing the density and the activation energy of the defect level. The study leads to the conclusion that deep impurities of a type opposite to that of the shallow impurities enhance the barrier height. On the other hand, the barrier height is lowered when the type of the deep impurities is the same as that of the shallow impurities.

Relevance:

20.00%

Publisher:

Abstract:

In earlier work, nonisomorphic graphs have been converted into networks to realize multistage interconnection networks that are topologically nonequivalent to the Baseline network. The drawback of this technique is that these nonequivalent networks are not guaranteed to be self-routing, because each node in the graph model can be replaced by a (2 × 2) switch in any one of four different configurations. Hence, the problem of routing in these networks remains unsolved. Moreover, the nonisomorphic graphs were obtained by interconnecting bipartite loops in a heuristic manner; the heuristic nature of this procedure makes it difficult to guarantee full connectivity in large networks. We solve these problems through a direct approach, in which a matrix model for self-routing networks is developed. An example is given to show that this model encompasses nonequivalent self-routing networks. This approach has the additional advantage that the matrix model itself ensures full connectivity.
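The self-routing property the paper aims to preserve can be illustrated with the textbook destination-tag rule for a Baseline-style network of 2 × 2 switches: at stage k a packet is forwarded on the upper or lower output according to bit k of its destination address. The snippet below is a generic illustration, not the matrix model developed in the paper.

# Destination-tag self-routing through log2(N) stages of 2x2 switches.
def self_route(dest: int, stages: int) -> list[int]:
    """Return the output port (0 = upper, 1 = lower) chosen at each stage."""
    return [(dest >> (stages - 1 - k)) & 1 for k in range(stages)]

# Route to destination 5 (binary 101) in an 8-input network (3 stages):
print(self_route(5, 3))   # [1, 0, 1]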

Relevance:

20.00%

Publisher:

Abstract:

Characterization of silver- and gold-related defects in gallium arsenide is carried out. These impurities were introduced during the thermal diffusion process, and the related defects are characterized by deep-level transient spectroscopy and photoluminescence. The silver-related center in GaAs shows a 0.238 eV photoluminescence line corresponding to a no-phonon transition, whereas its thermal ionization energy is found to be 0.426 eV. The thermal activation energy of the gold-related center in GaAs is 0.395 eV, but there is no corresponding luminescence signal.
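For reference, thermal activation energies such as the 0.426 eV and 0.395 eV values above are typically extracted from an Arrhenius analysis of the thermal emission rate measured by DLTS (the standard relation, not specific to this paper):

  e_n(T) = \sigma_n \,\langle v_{\mathrm{th}} \rangle\, N_c \,\exp\!\left(-\frac{E_a}{k_B T}\right),

where \sigma_n is the capture cross section, \langle v_{\mathrm{th}} \rangle the carrier thermal velocity, N_c the effective density of states, and E_a the activation energy obtained from the slope of ln(e_n/T^2) versus 1/T.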

Relevance:

20.00%

Publisher:

Abstract:

"Fifty-six teachers, from four European countries, were interviewed to ascertain their attitudes to and beliefs about the Collaborative Learning Environments (CLEs) which were designed under the Innovative Technologies for Collaborative Learning Project. Their responses were analysed using categories based on a model from cultural-historical activity theory [Engestrom, Y. (1987). Learning by expanding.- An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit; Engestrom, Y., Engestrom, R., & Suntio, A. (2002). Can a school community learn to master its own future? An activity-theoretical study of expansive learning among middle school teachers. In G. Wells & G. Claxton (Eds.), Learning for life in the 21st century. Oxford: Blackwell Publishers]. The teachers were positive about CLEs and their possible role in initiating pedagogical innovation and enhancing personal professional development. This positive perception held across cultures and national boundaries. Teachers were aware of the fact that demanding planning was needed for successful implementations of CLEs. However, the specific strategies through which the teachers can guide students' inquiries in CLEs and the assessment of new competencies that may characterize student performance in the CLEs were poorly represented in the teachers' reflections on CLEs. The attitudes and beliefs of the teachers from separate countries had many similarities, but there were also some clear differences, which are discussed in the article. (c) 2005 Elsevier Ltd. All rights reserved."

Relevance:

20.00%

Publisher:

Abstract:

The major contribution of this paper is to introduce load compatibility constraints into the mathematical model for the capacitated vehicle routing problem with pickups and deliveries. The employee transportation problem in Indian call centers and the transportation of hazardous materials provided the motivation for this variation. In this paper we develop an integer programming model for the vehicle routing problem with load compatibility constraints. Specifically, two types of load compatibility constraints are introduced, namely mutual exclusion and conditional exclusion. The model is demonstrated with an application to the employee transportation problem in Indian call centers.
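One plausible way to express such constraints over binary variables y_i^v (load i assigned to vehicle v) is sketched below; the exact formulation in the paper may differ, so this is only an illustration. Mutual exclusion forbids two incompatible loads on the same vehicle, while conditional exclusion allows them together only when an additional condition holds, represented here by a hypothetical indicator z_{ij}^v:

  y_i^v + y_j^v \le 1 \quad \forall (i,j) \in \mathcal{M},\ \forall v
  \qquad\text{and}\qquad
  y_i^v + y_j^v \le 1 + z_{ij}^v \quad \forall (i,j) \in \mathcal{C},\ \forall v ,

where \mathcal{M} and \mathcal{C} denote the mutually and conditionally incompatible load pairs, respectively.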