998 results for data backup


Relevance:

60.00%

Publisher:

Abstract:

Through extensive study and design, this paper presents a technical plan for establishing an exploration database center that combines imported and self-developed techniques. Through research and repeated experiments, a modern database center has been set up with high-performance hardware and network, a well-configured system, complete data storage and management, and fast, direct data support. Based on the study of decision theory, methods and models, an exploration decision-assistance schema has been designed, and one decision plan, the well location decision support system, has been evaluated and put into operation.

1. Establishment of the Shengli exploration database center. Research was carried out on the hardware configuration of the database center, including its workstations and all connected hardware and systems. The hardware of the database center is formed by connecting workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation. Research on data storage and management covers analysis of the content to be managed, data flow, data standards, data QC, the data backup and restore policy, and optimization of the database system. A reasonable data management regulation and workflow has been established, and a scientific exploration data management system created. Data loading followed a schedule worked out in advance; in the end, more than 200 seismic survey projects, amounting to 25 TB, were loaded.

2. Exploration work support system and its application. The seismic data processing support system provides automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, a pseudo huge-capacity disk array, and standard output exchange formats. Prestack data can be accessed directly by the processing system, or transferred to other processing systems through a standard exchange format. For seismic interpretation, the system provides automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database center to obtain real-time support of seismic, formation and well data. Comprehensive geological study is supported through the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyze and display production data, with controlled data collection and the creation of multiple standard forms as its core technology.

3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages, and by studying decision theory and methods, the target of each decision step, the decision models and the requirements, three conceptual models have been formed for the Shengli exploration decision support system: the exploration distribution support system, the well location support system and the production management support system. The well location decision support system has passed evaluation and been put into operation.

4. Technical advances. The hardware and software of the database center are matched for high performance. By combining a parallel computer system, database servers, a huge-capacity ATL, disk arrays, the network and a firewall, the first exploration database center in China has been created, with a reasonable configuration, high performance and the ability to manage the complete exploration data sets. A technology for managing huge volumes of exploration data has been formed, in which exploration data standards and management regulations guarantee data quality, safety and security. A multifunction query and support system provides comprehensive exploration information support; it includes support systems for geological study, seismic processing and interpretation, and production management, and applies many new database and computer technologies to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system has been designed.

5. Application and benefit. Data storage has reached 25 TB, and thousands of users in the Shengli oil field access the data, improving work efficiency several times over. The technology has also been applied by many other SINOPEC units. Providing data to the project "Exploration achievements and Evaluation of Favorable Targets in Hekou Area" shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with the database center providing full information support. Providing previously processed results for the project "Pre-stack depth migration in Guxi fracture zone" reduced the amount of reprocessing, shortened the work period by one month, improved processing precision and quality, and saved 30 million yuan of capital investment in data processing. Providing a project database automatically for the project "Geological and seismic study of southern slope zone of Dongying Sag" shortened data preparation time so that researchers had more time for research, improving interpretation precision and quality.
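
The abstract mentions "extraction of data cubes of any size" without describing how it is implemented; purely as a hedged illustration of that idea (not the paper's actual system), the sketch below slices an arbitrarily sized sub-cube out of an in-memory seismic volume with numpy. The function name, array layout and index ranges are hypothetical.

```python
# Illustrative sketch only: the paper does not describe its implementation.
# Assumes a seismic volume stored as a 3-D numpy array indexed by
# (inline, crossline, sample); all names and ranges are hypothetical.
import numpy as np

def extract_cube(volume: np.ndarray,
                 inlines: slice,
                 crosslines: slice,
                 samples: slice) -> np.ndarray:
    """Return a copy of an arbitrarily sized sub-cube of the survey volume."""
    return volume[inlines, crosslines, samples].copy()

# Example: a small synthetic survey and a 100 x 200 x 50 sub-cube request.
survey = np.random.rand(500, 600, 1000).astype(np.float32)
cube = extract_cube(survey,
                    inlines=slice(100, 200),
                    crosslines=slice(250, 450),
                    samples=slice(0, 50))
print(cube.shape)  # (100, 200, 50)
```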

Relevance:

60.00%

Publisher:

Abstract:

Purpose – Many methodologies exist to assess the security risks associated with unauthorized leakage, modification and interruption of information used by organisations. This paper argues that these methodologies have a traditional orientation towards the identification and assessment of technical information assets. This obscures key risks associated with the cultivation and deployment of organisational knowledge. The purpose of this paper is to explore how security risk assessment methods can more effectively identify and treat the knowledge associated with business processes.

Design/methodology/approach – The argument was developed through an illustrative case study in which a well-documented traditional methodology is applied to a complex data backup process. Follow-up interviews were conducted with the organisation’s security managers to explore the results of the assessment and the nature of knowledge “assets” within a business process.

Findings – It was discovered that the backup process depended, in subtle and often informal ways, on tacit knowledge to sustain operational complexity, handle exceptions and make frequent interventions. Although typical information security methodologies identify people as critical assets, this study suggests a new approach might draw on more detailed accounts of individual knowledge, collective knowledge and their relationship to organisational processes.

Originality/value – Drawing on the knowledge management literature, the paper suggests mechanisms to incorporate these knowledge-based considerations into the scope of information security risk methodologies. A knowledge protection model is presented as a result of this research. This model outlines ways in which organisations can effectively identify and treat risks around process knowledge critical to the business.

Relevance:

60.00%

Publisher:

Abstract:

Backup and information security in Finnish micro-enterprises are matters that often do not receive sufficient attention because of missing expertise, lack of time or scarce resources. Information security was chosen as one research topic of this thesis because it is a current and much-discussed subject. Backup was chosen as the second topic, since it is very closely tied to information security and is a mandatory measure for guaranteeing the continuity of a company's business. This thesis examines the information security of a micro-enterprise and considers how it can be improved with simple methods. In addition, the backup practices of a micro-enterprise and the related issues and problems are examined.

The goal of the thesis is to study information security and backup at a general level and to create several alternative backup solutions based on the literature and theory. The company's information security and backup are examined using a fictitious model company as the research environment, because in this way the research environment can be defined and delimited precisely. Since these subject areas are quite broad, the topic is limited mainly to backup, potential security threats and the study of information security at a general level. Based on the research, two possible local backup solutions and one remote backup solution have been developed. The local backup options are backup to an external hard drive and backup to a NAS (Network Attached Storage) server. The remote backup option is backup to a remote server, such as a cloud service. Although a NAS server is a local backup solution, it can also be used for remote backup depending on where the device is located.

The thesis briefly compares and evaluates the solution options using evaluation criteria created on the basis of the research, and presents a scoring model to make it easier to evaluate the solutions and choose a suitable option. Each option has its own strengths and weaknesses, so choosing the right one is not always easy. The suitability of an option for a particular company always depends on the company's own needs and requirements. Because different companies often have different backup requirements and needs, finding the backup solution that best suits a company can be difficult and time-consuming. The solution options presented in this thesis serve as a guide and foundation for planning, selecting, deciding on and building a micro-enterprise's backup system.
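
The thesis defines its own evaluation criteria and scoring model, which are not reproduced here; as a hedged sketch of what such a weighted scoring model for comparing backup options might look like, the snippet below combines per-criterion scores with weights into a single total per option. All criteria, weights and scores are invented for illustration and are not taken from the thesis.

```python
# A minimal weighted-scoring sketch for comparing backup options.
# The criteria, weights and scores below are invented for illustration;
# the thesis defines its own criteria and point scales.

# Weight of each criterion (chosen to sum to 1.0 for readability).
weights = {"cost": 0.3, "ease_of_use": 0.2, "reliability": 0.3, "offsite": 0.2}

# Scores per option on a 1-5 scale (hypothetical values).
options = {
    "external_hdd":  {"cost": 5, "ease_of_use": 4, "reliability": 3, "offsite": 1},
    "nas_server":    {"cost": 3, "ease_of_use": 3, "reliability": 4, "offsite": 2},
    "cloud_service": {"cost": 2, "ease_of_use": 4, "reliability": 4, "offsite": 5},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Combine per-criterion scores into a single weighted total."""
    return sum(weights[c] * s for c, s in scores.items())

for name, scores in options.items():
    print(f"{name}: {weighted_score(scores, weights):.2f}")
```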

Relevance:

30.00%

Publisher:

Abstract:

Final Master's project submitted for the degree of Master in Electronics and Telecommunications Engineering.

Relevance:

30.00%

Publisher:

Abstract:

An overview of why and how to make copies of data so that these additional copies can be used to restore the original after a data loss event, for example by burning CDs or copying files to memory sticks.
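
As a small, hedged illustration of the "additional copy" idea (not part of the original item), the snippet below copies a working folder to a removable drive using Python's standard library; the paths are placeholders.

```python
# Minimal sketch: copy a working folder to a removable drive so a second
# copy exists if the original is lost. Both paths below are placeholders.
import shutil
from pathlib import Path

source = Path("~/Documents/projects").expanduser()
destination = Path("/media/usb-stick/projects-backup")  # hypothetical mount point

# copytree copies the whole directory tree; dirs_exist_ok lets repeat runs
# refresh an existing backup folder instead of failing.
shutil.copytree(source, destination, dirs_exist_ok=True)
print(f"Copied {source} -> {destination}")
```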

Relevance:

30.00%

Publisher:

Abstract:

Despite significant advancements in wireless sensor networks (WSNs), energy conservation remains one of the most important research challenges. Proper organization of nodes into clusters is one of the major techniques for extending the lifespan of the whole network, since data are aggregated at the cluster head. The cluster head is the backbone of the entire cluster: if it fails to perform its function, the data it has received and collected can be lost, and the energy consumed by direct communication from source nodes to the base station increases. In this paper, we propose a type-2 fuzzy based self-configurable cluster head selection (SCCH) approach that not only addresses the cluster head selection criteria but also provides a backup cluster head, so that the system keeps working efficiently even if the cluster head fails. The novelty of this protocol is its ability to handle communication uncertainty, which is an inherent operational aspect of sensor networks. The experimental results indicate that SCCH performs better than other recently developed methods.
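
The paper's actual method relies on type-2 fuzzy inference, which is not reproduced here; as a loose, hedged sketch of the general idea of ranking nodes and keeping a backup cluster head, the snippet below scores nodes with a simple weighted combination of residual energy and distance to the base station. The scoring rule, weights and field names are illustrative assumptions, not the SCCH algorithm.

```python
# Loose illustration only: rank candidate nodes and keep the runner-up as a
# backup cluster head. The real SCCH protocol uses type-2 fuzzy inference;
# this simple weighted score is an assumption made for the sketch.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float   # remaining energy (hypothetical units)
    distance_to_base: float  # metres to the base station

def score(node: Node, w_energy: float = 0.7, w_distance: float = 0.3) -> float:
    """Higher is better: favour energy-rich nodes close to the base station."""
    return w_energy * node.residual_energy - w_distance * node.distance_to_base

def elect_heads(cluster: list[Node]) -> tuple[Node, Node]:
    """Return (primary, backup) cluster heads; the backup takes over on failure."""
    ranked = sorted(cluster, key=score, reverse=True)
    return ranked[0], ranked[1]

cluster = [Node(1, 4.2, 80.0), Node(2, 3.9, 35.0), Node(3, 4.8, 120.0)]
primary, backup = elect_heads(cluster)
print(primary.node_id, backup.node_id)
```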

Relevance:

30.00%

Publisher:

Abstract:

This article presents the data-rich findings of an experiment with enlisting patron-driven/demand-driven acquisitions (DDA) of ebooks in two ways. The first experiment compared DDA ebook usage against the circulation of newly ordered hardcopy materials, both overall and for ebook vs. print usage within the same subject areas. The second experimented with DDA ebooks as a backup plan for unfunded requests left over at the end of the fiscal year.

Relevance:

30.00%

Publisher:

Abstract:

This article presents the data-rich findings of an experiment with enlisting patron-driven/demand-driven acquisitions (DDA) of ebooks in two ways. The first experiment compared DDA ebook usage against the circulation of newly ordered hardcopy materials, both overall and for ebook vs. print usage within the same subject areas. The second experimented with DDA ebooks as a backup plan for unfunded requests left over at the end of the fiscal year.

Relevance:

30.00%

Publisher:

Abstract:

Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication is backup storage, where these approaches are able to reduce the storage requirements to a small fraction of the logical backup data size. This thesis introduces multiple new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design that allows using a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important, but often overlooked, data structure in data deduplication systems, so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. As file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, this compression enables significant savings.

A technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced next. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk-lookup disk bottleneck of data deduplication systems, which limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches and is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20 and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into future HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
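
The thesis concerns fingerprinting-based deduplication with small chunks; as a minimal, hedged sketch of that general idea only (fixed-size chunking plus an in-memory fingerprint index, not the thesis's cluster design, recipe compression or Block Locality Caching), the snippet below stores each unique chunk once, keyed by its SHA-256 fingerprint, and records a per-file recipe of fingerprints.

```python
# Minimal fingerprinting-based deduplication sketch: fixed-size chunking plus a
# fingerprint index. Real systems (including the thesis's) typically use
# content-defined chunking, on-disk indexes and recipes; this is illustrative only.
import hashlib

CHUNK_SIZE = 4096  # bytes; a small chunk size chosen here as an assumption

chunk_store: dict[str, bytes] = {}   # fingerprint -> unique chunk data

def deduplicate(data: bytes) -> list[str]:
    """Split data into chunks, store unseen chunks, and return the recipe
    (the ordered list of chunk fingerprints)."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(fp, chunk)   # store only the first occurrence
        recipe.append(fp)
    return recipe

def restore(recipe: list[str]) -> bytes:
    """Rebuild the original data from its recipe."""
    return b"".join(chunk_store[fp] for fp in recipe)

backup = b"A" * 10000 + b"B" * 10000 + b"A" * 10000   # redundant sample data
recipe = deduplicate(backup)
print(len(recipe), "chunks referenced,", len(chunk_store), "unique chunks stored")
assert restore(recipe) == backup
```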

Relevance:

30.00%

Publisher:

Abstract:

Adult male and female emperor penguins (Aptenodytes forsteri) were fitted with satellite transmitters at Pointe-Géologie (Adélie Land), Dumont d'Urville Sea coast, in November 2005. Nine of 30 data sets were selected for analyses to investigate the penguins' diving behaviour at high resolution (doi:10.1594/PANGAEA.633708, doi:10.1594/PANGAEA.633709, doi:10.1594/PANGAEA.633710, doi:10.1594/PANGAEA.633711). The profiles are in synchrony with foraging trips of the birds during austral spring (doi:10.1594/PANGAEA.472171, doi:10.1594/PANGAEA.472173, doi:10.1594/PANGAEA.472164, doi:10.1594/PANGAEA.472160, doi:10.1594/PANGAEA.472161). Corresponding high-resolution winter data (n = 5; archived elsewhere) were provided by A. Ancel, Centre d'Ecologie et Physiologie Energétiques, CNRS, Strasbourg, France. Air-breathing divers tend to increase their overall dive duration with increasing dive depth. In most penguin species, this occurs due to increasing transit (descent and ascent) durations but also because the duration of the bottom phase of the dive increases with increasing depth. We interpreted the efficiency with which emperor penguins can exploit different diving depths by analysing dive depth profile data of nine birds studied during the early and late chick-rearing period in Adélie Land, Antarctica. Another eight data sets of dive depth and duration frequency recordings (doi:10.1594/PANGAEA.472150, doi:10.1594/PANGAEA.472152, doi:10.1594/PANGAEA.472154, doi:10.1594/PANGAEA.472155, doi:10.1594/PANGAEA.472142, doi:10.1594/PANGAEA.472144, doi:10.1594/PANGAEA.472146, doi:10.1594/PANGAEA.472147), which back up the analysed high-resolution depth profile data, and the dive depth and duration frequency recordings of another bird (doi:10.1594/PANGAEA.472156, doi:10.1594/PANGAEA.472148) did not meet the high-resolution requirement for these analyses. Eleven additional data sets provide information on the overall foraging distribution of emperor penguins during the period analysed (doi:10.1594/PANGAEA.472157, doi:10.1594/PANGAEA.472158, doi:10.1594/PANGAEA.472162, doi:10.1594/PANGAEA.472163, doi:10.1594/PANGAEA.472166, doi:10.1594/PANGAEA.472167, doi:10.1594/PANGAEA.472168, doi:10.1594/PANGAEA.472170, doi:10.1594/PANGAEA.472172, doi:10.1594/PANGAEA.472174, doi:10.1594/PANGAEA.472175).

Relevance:

20.00%

Publisher: