907 results for Stream Cipher


Relevance: 20.00%

Publisher:

Abstract:

Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful but carry a performance cost; without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead:

a) Hierarchical encryption - the stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange uses simple and relatively weak encryption and keys to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so that the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it.

b) Non-symbolic fragmentation with signal diversity - communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned into whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, it does so for load-balancing purposes, invisibly to the end application. Network and path diversity deliberately apply the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data further confuses any intercepted stream: data are broken into bit strings which are disordered prior to transmission, so that even if all transmissions were intercepted, the cryptanalyst would still need to determine fragment boundaries and order them correctly.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher with unpredictable, irregularly-sized fragments. Both technologies have applications outside the commercial domain and, being functionally orthogonal, can be used in conjunction with other forms of encryption.
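
As a rough illustration of the non-symbolic fragmentation idea described in (b), the following Python sketch (not from the source; the function names, fragment sizes, and use of Python's random module as the keyed PRNG are illustrative assumptions) breaks a byte string into irregularly sized bit fragments and emits them in a key-dependent order. In practice the receiver would re-derive the fragment lengths and ordering from the shared key rather than receiving them alongside the data.

```python
import random

def fragment_bits(data: bytes, key: int, min_len=3, max_len=11):
    """Split data into irregularly sized bit-string fragments and
    emit them in a key-dependent order (a bit-level transposition)."""
    bits = ''.join(f'{byte:08b}' for byte in data)
    rng = random.Random(key)                 # keyed PRNG shared by both parties
    fragments, i = [], 0
    while i < len(bits):
        n = rng.randint(min_len, max_len)    # unpredictable fragment size
        fragments.append(bits[i:i + n])
        i += n
    order = list(range(len(fragments)))
    rng.shuffle(order)                       # disorder fragments before sending
    return [fragments[j] for j in order], order

def reassemble(shuffled, order):
    """Receiver, knowing the key-derived order, restores the original bits."""
    fragments = [None] * len(order)
    for frag, pos in zip(shuffled, order):
        fragments[pos] = frag
    bits = ''.join(fragments)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

shuffled, order = fragment_bits(b"pay-per-view chunk", key=2024)
assert reassemble(shuffled, order) == b"pay-per-view chunk"
```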

Relevance: 20.00%

Publisher:

Abstract:

In this thesis, the ability of horizontal tidal current turbines to produce electricity, and the feasibility of installing these turbines on bridge piers in marine environments, is studied with the aim of reducing initial implementation costs and making the scheme economical. To this end, the forces exerted by installed horizontal tidal current turbines were compared with the forces applied to the bridge structure during the design process (as given in the standards), and the allowable range of overloading tolerable by the bridge piers was obtained. It was concluded that, in order to install these turbines, the piers of existing bridges would need to be strengthened. In view of the increasing use of renewable power, and as a recommendation, the forces exerted by an installed turbine are given for the loading coefficients of different standards. Finally, as an example, the preliminary design of a horizontal tidal current turbine was carried out for the Gesham Channel, and the forces exerted by the turbine on the bridge pier were calculated for future use in creating a full-scale test site.
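
The thesis's own load calculations are not reproduced here, but the axial thrust a turbine transfers to its support is commonly estimated with standard actuator-disc theory. The sketch below is a minimal illustration of that estimate; the rotor diameter, current speed, and thrust coefficient are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch: axial thrust of a horizontal tidal current turbine on its
# mounting, from actuator-disc theory: F = 0.5 * rho * Ct * A * U^2.
import math

RHO_SEAWATER = 1025.0  # kg/m^3

def rotor_thrust(diameter_m: float, current_speed_ms: float, ct: float = 0.8) -> float:
    """Axial thrust in newtons for a rotor of given diameter in a steady current."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return 0.5 * RHO_SEAWATER * ct * area * current_speed_ms ** 2

# Example: a 10 m rotor in a 2.5 m/s tidal current (illustrative values)
thrust = rotor_thrust(10.0, 2.5)
print(f"Thrust transferred to the pier: {thrust / 1e3:.0f} kN")
```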

Relevance: 20.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.

Relevance: 20.00%

Publisher:

Abstract:

133 p.

Relevance: 20.00%

Publisher:

Abstract:

Several teams of researchers at multiple universities are currently measuring annual and seasonal fluxes of carbon dioxide and other greenhouse gases (nitrous oxide and methane) in riparian wetlands and upland forests in the Tenderfoot Creek Experimental Forest (TCEF), a subalpine watershed in the Little Belt Mountains, Montana. In the current thesis, the author characterized the geochemistry and stable carbon isotope composition of shallow groundwater, soil water, and stream water in upper Stringer Creek, near sites that are being investigated for gas chemistry and microbial studies. It was hypothesized that if methanogenesis were a dominant process in the riparian wetlands of upper Stringer Creek, then this should impart a characteristic signal on the measured stable isotopic composition of dissolved inorganic carbon in shallow groundwater. For the most part, the major solute composition of shallow groundwater in upper Stringer Creek was similar to that of the stream. However, several wells completed in wetland soil had highly elevated concentrations of Fe2+ and Mn2+, which were absent in the well-oxygenated surface water. Use of sediment pore-water samplers (peepers) demonstrated a rapid increase in Fe2+ and Mn2+ with depth, most feasibly explained by microbial reduction of Fe- and Mn-oxide minerals. In general, the pH of shallow groundwater was lower than that of the stream. Since concentrations of CO2 in the groundwater samples were consistently greater than atmospheric pCO2, exchange of CO2 gas across the stream/air interface occurred in one direction, from stream to air. Evasion of CO2 partly explains the higher pH values in the stream. Microbial processes involving breakdown of organic carbon, including aerobic respiration, anaerobic respiration, and methanogenesis, explain the occurrence of excess CO2 in the groundwater. In general, the isotopic composition of total dissolved inorganic carbon (DIC) decreased with increasing DIC concentration, consistent with aerobic and/or anaerobic respiration being the dominant metabolic process in shallow groundwater. However, a minority of wells contained high DIC concentrations that were anomalously heavy in δ13C, and these same wells had elevated concentrations of dissolved methane. It is concluded that the wells with isotopically heavier DIC have likely been influenced by acetoclastic methanogenesis. Results from shallow groundwater wells and one of the peeper samplers suggest a possible link between methanogenesis and bacterial iron reduction.
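
For context (standard isotope-geochemistry notation, not taken from the thesis), the δ13C values referred to above are carbon isotope ratios expressed in per mil (‰) relative to the VPDB reference standard:

```latex
\[
\delta^{13}\mathrm{C} =
\left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{sample}}}
            {\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{VPDB}}} - 1 \right)
\times 1000
\]
```

Higher (less negative) δ13C means the DIC is isotopically "heavier", the signal attributed above to acetoclastic methanogenesis.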

Relevance: 20.00%

Publisher:

Abstract:

Multiple indices of biotic integrity and biological condition gradient models have been developed and validated to assess ecological integrity in the Laurentian Great Lakes Region. With multiple groups, such as Tribal, Federal, and State agencies as well as scientists and local watershed management or river-focused volunteer groups, collecting data for bioassessment, it is important to determine the comparability of the data and the effectiveness of the indices applied to them for the assessment of natural systems. We evaluated the applicability of macroinvertebrate and fish community indices for assessing site integrity. Site quality (i.e., habitat condition) could be classified differently depending on which index was applied, which highlights the need to better understand the metrics driving index variation, as well as reference conditions, for effective communication and use of indices of biotic integrity in the Upper Midwest. Based on replicate sampling, the ability to track trends over time, and overall performance, we found that the macroinvertebrate benthic community index for the Northern Lakes and Forests Ecoregion and a coldwater fish index of biotic integrity for the Upper Midwest were most appropriate for use in the Big Manistee River watershed. Using the most appropriate fish and macroinvertebrate IBIs, we evaluated three sites where improper road-stream crossings (culverts) had been improved by replacing them with modern full-span structures. We used a before-after-control-impact paired series analytical design and found mixed results, with evidence of improvement in biotic integrity based on macroinvertebrate indices at some sites, while most sites showed no response in index score. Culvert replacements are often undertaken based on the potential, or the perception, that they will restore ecological integrity. As restoration practitioners, researchers, and managers, we need to be transparent in our goals and objectives and monitor for those results specifically. The results of this research serve as an important model for the broader field of ecosystem restoration and support the argument that, while biotic communities can respond to actions undertaken with the goal of overall restoration, practitioners should be realistic in their expectations and claims of predicted benefit, and then effectively evaluate the true impacts of the restoration activities.
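
For readers unfamiliar with the before-after-control-impact paired series (BACIPS) design mentioned above, the sketch below shows the basic comparison in Python. The IBI scores are invented for illustration, not data from the study: the restoration effect is estimated as the change in the impact-minus-control difference from the period before the culvert replacement to the period after it.

```python
# Minimal sketch of a BACIPS comparison (illustrative data, not from the study).
import statistics

# Paired IBI scores sampled on the same dates at an impact site and a control site.
before = [(42, 45), (40, 44), (43, 46)]   # (impact, control) before replacement
after  = [(47, 45), (49, 46), (48, 44)]   # (impact, control) after replacement

def baci_effect(before_pairs, after_pairs):
    """Change in the impact-minus-control difference from before to after."""
    d_before = [impact - control for impact, control in before_pairs]
    d_after = [impact - control for impact, control in after_pairs]
    return statistics.mean(d_after) - statistics.mean(d_before)

print(f"Estimated BACI effect on IBI score: {baci_effect(before, after):+.1f}")
```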

Relevance: 20.00%

Publisher:

Abstract:

Hardboard processing wastewater was evaluated as a feedstock for the production of fuel-grade ethanol in a biorefinery co-located with the hardboard facility. A thorough characterization of the wastewater was conducted, and changes in its composition during processing in the biorefinery were tracked. It was determined that the wastewater had a low solids content (1.4%), and that hemicellulose was the main component of the solids, accounting for up to 70%. Acid pretreatment alone hydrolyzed the majority of the hemicellulose, as well as the oligomers, and over 50% of the monomer sugars generated were xylose. The percentage of lignin remaining in the liquid increased after acid pretreatment. The characterization results showed that hardboard processing wastewater is a feasible feedstock for the production of ethanol. The optimum conditions for hydrolyzing hemicellulose into fermentable sugars were evaluated with a two-stage experiment comprising acid pretreatment and enzymatic hydrolysis. The experimental data were fitted to second-order regression models and Response Surface Methodology (RSM) was employed. The results showed that, for this type of feedstock, enzymatic hydrolysis is largely unnecessary. To reach a comparatively high total sugar concentration (over 45 g/l) and a low furfural concentration (less than 0.5 g/l), the optimum conditions were an acid concentration between 1.41% and 1.81% and a reaction time of 48 to 76 minutes. The two products of the biorefinery were compared with traditional products, petroleum gasoline and traditional potassium acetate, from a sustainability perspective, with greenhouse gas (GHG) emissions as the indicator. Three allocation methods (system expansion, mass allocation, and market value allocation) were employed in this assessment. The life cycle GHG emissions of ethanol were -27.1, 20.8, and 16 g CO2 eq/MJ, respectively, under the three allocation methods, whereas that of petroleum gasoline is 90 g CO2 eq/MJ. The life cycle GHG emissions of potassium acetate under the mass allocation and market value allocation methods were 555.7 and 716.0 g CO2 eq/kg, whereas that of traditional potassium acetate is 1020 g CO2/kg.
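
The mass and market-value allocation methods named above partition the biorefinery's total GHG burden between its co-products in proportion to their mass or revenue. The sketch below illustrates that partitioning in Python; the burden, outputs, and prices are made-up numbers, not the thesis data.

```python
# Minimal sketch of co-product allocation in an LCA (illustrative numbers only).
total_ghg_kg_co2e = 1000.0                       # total burden to allocate

products = {                                     # annual output and price (illustrative)
    "ethanol":           {"mass_kg": 6000.0, "price_per_kg": 0.80},
    "potassium_acetate": {"mass_kg": 2000.0, "price_per_kg": 1.50},
}

def allocate(products, total, weight_of):
    """Split `total` across products in proportion to a per-product weight."""
    weights = {name: weight_of(p) for name, p in products.items()}
    total_weight = sum(weights.values())
    return {name: total * w / total_weight for name, w in weights.items()}

by_mass = allocate(products, total_ghg_kg_co2e, lambda p: p["mass_kg"])
by_value = allocate(products, total_ghg_kg_co2e, lambda p: p["mass_kg"] * p["price_per_kg"])
print("Mass allocation:        ", by_mass)
print("Market-value allocation:", by_value)
```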

Relevance: 20.00%

Publisher:

Abstract:

Ensemble stream modeling and data-cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism which seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher quality streams can be realized by combining many short streams into an ensemble which has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as a bush or natural forest fire, we take the burnt area (BA*), the sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: one, the histogram of fire activity is highly skewed; two, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure, such as the F-test, is performed during each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were used with a simpler tree classifier. The ensemble framework for data-cleaning and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors led to the formation of quality streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast number of real-time mobile streams generated today.
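
The F-measure used above is the harmonic mean of precision and recall. A minimal, self-contained illustration (not the thesis code; the fire/no-fire labels are invented) of how it would score a detector's alarms:

```python
# Minimal illustration of the F-measure for fire-event detection.
def f_measure(true_labels, predicted_labels):
    tp = sum(1 for t, p in zip(true_labels, predicted_labels) if t and p)
    fp = sum(1 for t, p in zip(true_labels, predicted_labels) if not t and p)  # false alarms
    fn = sum(1 for t, p in zip(true_labels, predicted_labels) if t and not p)  # missed fires
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

# 1 = fire event, 0 = no event (illustrative labels)
truth      = [1, 0, 0, 1, 1, 0, 0, 1]
prediction = [1, 1, 0, 1, 0, 0, 0, 1]
print(f"F-measure: {f_measure(truth, prediction):.2f}")
```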

Relevance: 20.00%

Publisher:

Abstract:

Value-stream mapping (VSM) is a helpful tool for identifying waste and areas for improvement, and it has emerged as a preferred way to support and implement the lean approach. While lean principles are well established and have broad applicability in manufacturing, their extension to information technology is still limited. Based on a case study approach, this paper presents the implementation of VSM in an IT firm as a lean IT improvement initiative. It involves mapping the firm's current activities and identifying opportunities for improvement. After several interviews with employees currently involved in the process, a current-state map is prepared to describe the existing problem areas, and a future-state map is prepared to show the proposed improvement action plans. The achievements of the VSM implementation are reductions in lead time, cycle time, and resources. Our finding indicates that, with the new process change, total lead time can be reduced from 20 days to 3 days, a 92% reduction in overall lead time for the database provisioning process.

Relevance: 20.00%

Publisher:

Abstract:

Monitoring of nitrogen and phosphorus in streams and rivers throughout Iowa is an essential element of the Iowa Nutrient Reduction Strategy (INRS). Sampling and analysis of surface water is necessary to develop periodic estimates of the amounts of nitrogen and phosphorus transported from Iowa. Surface and groundwater monitoring provides the scientific evidence needed to document the effectiveness of nutrient reduction practices and their impact on water quality. Lastly, monitoring data informs decisions about where and how best to implement nutrient reduction practices, by both point sources and nonpoint sources, to provide the greatest benefit at the least cost. The impetus for this report comes from the Water Resources Coordination Council (WRCC), which states in its 2014-15 Annual Report: “Efforts are underway to improve understanding of the multiple nutrient monitoring efforts that may be available and can be compared to the nutrient WQ monitoring framework to identify opportunities and potential data gaps to better coordinate and prioritize future nutrient monitoring efforts.” This report is the culmination of those efforts.

Relevance: 20.00%

Publisher:

Abstract:

The Iowa Nutrient Reduction Strategy (NRS) is a research- and technology-based approach to assess and reduce the nutrients (nitrogen and phosphorus) delivered to Iowa waterways and the Gulf of Mexico by 45 percent. To measure progress, researchers track many different factors, from inputs (e.g. funding) and the human domain (e.g. farmer perspectives) to land management (e.g. on-farm practices) and water quality. Monitoring Iowa streams provides valuable insight into measuring water quality progress and the reduction of surface water nutrient loss. The NRS aims to reduce the load, or total amount (e.g. tons), of nutrients lost annually. Researchers calculate the load from water monitoring results, combining measured concentration with stream flow.
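
The load calculation described above multiplies concentration by flow and accumulates the result over the monitoring period. A minimal sketch in Python (the concentration and flow values are illustrative, not INRS monitoring data):

```python
# Minimal sketch of a nutrient load calculation: load = concentration x flow.
SECONDS_PER_DAY = 86400

def daily_load_kg(concentration_mg_per_l: float, flow_m3_per_s: float) -> float:
    """Daily load in kg: 1 mg/L equals 1 g/m^3, so load = C * Q * seconds / 1000."""
    return concentration_mg_per_l * flow_m3_per_s * SECONDS_PER_DAY / 1000.0

# Example: 8 mg/L nitrate-N in a stream flowing at 12 m^3/s (illustrative values)
print(f"{daily_load_kg(8.0, 12.0):,.0f} kg of nitrate-N per day")
```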

Relevance: 20.00%

Publisher:

Abstract:

Iowa’s rivers are constantly shifting and changing, and they can be challenging places in which to design, construct, and maintain water trails. This section discusses aspects you will immediately encounter when developing a water trail: launches, parking areas, and trails. The intended users and expected use inform how these amenities are designed and constructed. Water trails intended for extended families, for example, are designed differently from those intended for experienced paddlers on multi-day trips.