14 results for Advanced Encryption Standard
in Dalarna University College Electronic Archive
Abstract:
The rapid development of data transfer over the Internet has made it easier to send data quickly and accurately to its destination. There are many transmission media for transferring data, such as e-mail; at the same time, valuable information can easily be modified and misused through hacking. In order to transfer data securely to the destination without any modification, there are approaches such as cryptography and steganography. This paper deals with image steganography and discusses various security issues, along with a general overview of cryptography, steganography and digital watermarking. The problem of copyright violation of multimedia data has increased with the enormous growth of computer networks, which provide fast and error-free transmission of unauthorized and possibly manipulated copies of multimedia information. To be effective for copyright protection, a digital watermark must be robust, i.e. difficult to remove from the object in which it is embedded despite a variety of possible attacks. To send the message safely and securely, we use watermarking: the message is embedded as an invisible watermark using the LSB (Least Significant Bit) steganographic technique. The standard LSB technique embeds the message in every pixel; our contribution in the proposed watermarking scheme is to use a hint so that the message is embedded only in the image edges. Even if an attacker knows that the system uses the LSB technique, the correct message cannot be recovered. To make the system more robust and secure, we add a cryptographic algorithm, the Vigenère square, so that the message is transmitted as cipher text, which is an added advantage of the proposed system. The standard Vigenère square works with either lower-case or upper-case letters only; the proposed algorithm extends the Vigenère square with numbers as well, so the key can be a combination of characters and numbers. By modifying and updating the existing algorithm in this way and combining cryptography and steganography, we develop a secure and strong watermarking method. The performance of this watermarking scheme has been analyzed by evaluating the robustness of the algorithm with PSNR (Peak Signal to Noise Ratio) and MSE (Mean Square Error) against the quality of the image for large amounts of data. The results of the proposed encryption show a high PSNR value of 89 dB with a small MSE value of 0.0017. The proposed watermarking system therefore appears secure and robust for hiding sensitive information in any digital system, because it combines the properties of both steganography and cryptography.
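A minimal sketch (not the authors' implementation) of the building blocks the abstract above describes: a Vigenère square extended with digits, LSB embedding restricted to edge pixels, and the PSNR/MSE quality metrics. The gradient-threshold edge detector and all names are illustrative assumptions; a practical scheme must also ensure the receiver recovers the same edge set after embedding, which the paper's "hint" presumably addresses.

import numpy as np

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"  # letters extended with digits


def vigenere(text, key, decrypt=False):
    """Vigenere square over a 36-symbol alphabet (A-Z plus 0-9)."""
    out = []
    n = len(ALPHABET)
    for i, ch in enumerate(text.upper()):
        if ch not in ALPHABET:
            out.append(ch)                      # pass non-alphabet symbols through
            continue
        shift = ALPHABET.index(key[i % len(key)].upper())
        if decrypt:
            shift = -shift
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % n])
    return "".join(out)


def edge_mask(gray, threshold=30):
    """Boolean mask marking 'edge' pixels via a simple horizontal gradient."""
    grad = np.abs(np.diff(gray.astype(int), axis=1))
    mask = np.zeros(gray.shape, dtype=bool)
    mask[:, 1:] = grad > threshold
    return mask


def embed_lsb_on_edges(gray, message_bits):
    """Write message bits into the least significant bit of edge pixels only."""
    stego = gray.copy()
    rows, cols = np.nonzero(edge_mask(gray))
    if len(message_bits) > len(rows):
        raise ValueError("message longer than available edge pixels")
    for bit, r, c in zip(message_bits, rows, cols):
        stego[r, c] = (stego[r, c] & 0xFE) | bit
    return stego


def mse(a, b):
    """Mean square error between cover and stego image."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))


def psnr(a, b):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(255.0 ** 2 / m)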
Abstract:
The purpose of this work is to develop a web-based decision support system, based on fuzzy logic, to assess the motor state of Parkinson patients from their performance in on-screen motor tests in a test battery on a hand computer. A set of well-defined rules, based on an expert's knowledge, was constructed to diagnose the current state of the patient. At the end of a period, an overall score is calculated which represents the overall state of the patient during the period. Acceptability of the rules is based on the absolute difference between the patient's own assessment of his condition and the diagnosed state. Any inconsistency can be tracked by being highlighted as an alert in the system. Graphical presentation of data aims at enhanced analysis of the patient's state and performance monitoring by the clinic staff. In general, the system is beneficial for the clinic staff, patients, project managers and researchers.
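A hedged sketch (not the project's actual rule base) of the acceptability check described above: the diagnosed state is compared with the patient's own assessment, and an alert is raised when the absolute difference exceeds a tolerance. The tolerance value and the simple averaging used for the period score are assumptions for illustration.

from statistics import mean

def period_score(diagnosed_states):
    """Overall score for a test period: here simply the mean diagnosed state."""
    return mean(diagnosed_states)

def inconsistency_alert(diagnosed, self_assessed, tolerance=1.0):
    """Flag occasions where diagnosis and self-assessment disagree too much."""
    return abs(diagnosed - self_assessed) > tolerance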
Abstract:
Advanced Building Energy Data Visualization is a way to detect performance problems in commercial buildings. By placing sensors in a building that collect data on, for example, air temperature and electrical power, it becomes possible to process the data in data visualization software. This software generates visual diagrams so that the building manager or building operator can see if, for example, the power consumption is too high. A first step (before sensors are installed in a building) to see how much energy a building consumes can be to use a benchmarking tool. A number of benchmarking tools are available for free on the Internet. Each tool has a slightly different approach, but they all show how much energy a building consumes compared to other similar buildings. In this study a new web design for the benchmarking tool CalARCH has been developed. CalARCH is developed at the Berkeley Lab in Berkeley, California, USA. CalARCH uses data collected only from buildings in California, and is only for comparing buildings in California with other similar buildings in the state. Five different versions of the web site were made. A web survey was then done to determine which version would be the best for CalARCH. The results showed that Version 5 and Version 3 were the best. A new version was then made, based on these two versions. This study was made at the Lawrence Berkeley Laboratory.
Abstract:
Text messaging is a new form of writing, brought about by technological development over the last couple of decades. Mobile phone usage has increased rapidly worldwide and texting is now part of many people's everyday communication. A large number of users send or receive texts which include abbreviations and shortenings, commonly referred to as textspeak. This novel linguistic phenomenon is perceived by some with indifference and by others with aggravation. The following study examines attitudes towards this linguistic change from a gender and age perspective. The comparison between the two groups shows that the most conservative and least positive towards change are young women. The analysis and discussion around this focuses on power, prestige and patterns.
Abstract:
The study reported here is part of a large project for evaluation of the Thermo-Chemical Accumulator (TCA), a technology under development by the Swedish company ClimateWell AB. The studies concentrate on the use of the technology for comfort cooling. This report concentrates on measurements in the laboratory, modelling and system simulation. The TCA is a three-phase absorption heat pump that stores energy in the form of crystallised salt, in this case Lithium Chloride (LiCl), with water being the other substance. The process requires vacuum conditions, as with standard absorption chillers using LiBr/water. Measurements were carried out in the laboratories at the Solar Energy Research Center SERC, at Högskolan Dalarna, as well as at ClimateWell AB. The measurements at SERC were performed on a prototype version 7:1 and showed that this prototype had several problems resulting in poor and unreliable performance. The main results were that: there was significant corrosion leading to non-condensable gases, which in turn caused very poor performance; unwanted crystallisation caused blockages as well as inconsistent behaviour; and poor wetting of the heat exchangers resulted in relatively high temperature drops there. A measured thermal COP for cooling of 0.46 was found, which is significantly lower than the theoretical value. These findings resulted in a thorough redesign for the new prototype, called ClimateWell 10 (CW10), which was tested briefly by the authors at ClimateWell. The data collected here were limited, but enough to show that the machine worked consistently with no noticeable vacuum problems. They were also sufficient for identifying the main parameters in a simulation model developed for the TRNSYS simulation environment, but not enough to verify the model properly. This model was shown to be able to simulate the dynamic as well as static performance of the CW10, and was then used in a series of system simulations. A single system model was developed as the basis of the system simulations, consisting of a CW10 machine, 30 m2 of flat plate solar collectors with a backup boiler, and an office with a design cooling load in Stockholm of 50 W/m2, resulting in a 7.5 kW design load for the 150 m2 floor area. Two base cases were defined based on this: one for Stockholm using a dry cooler with a design cooling rate of 30 kW, and one for Madrid with a cooling tower with a design cooling rate of 34 kW. A number of parametric studies were performed based on these two base cases. These showed that the temperature lift is a limiting factor for cooling at higher ambient temperatures and for charging with a fixed-temperature source such as district heating. The simulated evacuated tube collector performs only marginally better than a good flat plate collector when considering the gross area, the margin being greater for larger solar fractions. For the 30 m2 collector, solar fractions of 49% and 67% were achieved for the Stockholm and Madrid base cases respectively. The average annual efficiency of the collector in Stockholm (12%) was much lower than that in Madrid (19%). The thermal COP was simulated to be approximately 0.70, but it has not been possible to verify this with measured data. The annual electrical COP was shown to be very dependent on the cooling load, as a large proportion of the electricity use is for components that are permanently on. For the cooling loads studied, the annual electrical COP ranged from 2.2 for a 2000 kWh cooling load to 18.0 for a 21000 kWh cooling load.
There is, however, potential to reduce the electricity consumption of the machine, which would improve these figures significantly. It was shown that a cooling tower is necessary for the Madrid climate, whereas a dry cooler is sufficient for Stockholm, although a cooling tower does improve performance. The simulation study was very shallow and has identified a number of areas that are important to study in more depth. One such area is advanced control strategy, which is necessary to mitigate the weakness of the technology (low temperature lift for cooling) and to optimally use its strength (storage).
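For reference, a small sketch of the two annual performance figures quoted above; the definitions follow common practice, and the report may use slightly different system boundaries.

def solar_fraction(solar_heat_kwh, total_heat_kwh):
    """Fraction of the annual heat demand covered by the solar collectors."""
    return solar_heat_kwh / total_heat_kwh

def annual_electrical_cop(cooling_delivered_kwh, electricity_used_kwh):
    """Annual cooling delivered per unit of electricity consumed.
    Dominated by permanently-on components at low cooling loads, which is why
    the study reports values from about 2.2 up to 18."""
    return cooling_delivered_kwh / electricity_used_kwh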
Abstract:
The thesis belongs to the field of lexical semantics studies, associated with describing the Russian linguistic world-image. The research focuses on the universal situation of purchase and sale as reflected in the Russian lexical standard and sub-standard. The work also deals with subjects related to the sphere of social linguistics: the social stratification of the language, the structure of sub-standard, etc. The thesis is a contribution to the description of the Russian linguistic world-image as well as to the further elaboration of the conceptual analysis method. The results are applicable in teaching Russian as a foreign language, particularly in lexis and in studies of Russian culture and mentality.
Abstract:
We generalize the standard linear-response (Kubo) theory to obtain the conductivity of a system that is subject to a quantum measurement of the current. Our approach can be used to specifically elucidate how back-action inherent to quantum measurements affects electronic transport. To illustrate the utility of our general formalism, we calculate the frequency-dependent conductivity of graphene and discuss the effect of measurement-induced decoherence on its value in the dc limit. We are able to resolve an ambiguity related to the parametric dependence of the minimal conductivity.
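As a point of reference for the generalization described above, the conventional (measurement-free) Kubo expression for the conductivity, in one common convention with the diamagnetic term written explicitly, reads

\sigma_{\alpha\beta}(\omega) = \frac{i}{\omega}\left[\frac{n e^{2}}{m}\,\delta_{\alpha\beta} + \Pi^{R}_{\alpha\beta}(\omega)\right],
\qquad
\Pi^{R}_{\alpha\beta}(\omega) = -\frac{i}{\hbar V}\int_{0}^{\infty} dt\, e^{i\omega t}\,\big\langle\big[\hat{j}_{\alpha}(t),\hat{j}_{\beta}(0)\big]\big\rangle ,

i.e. a retarded current-current correlator evaluated without any measurement back-action; the paper's contribution is to modify this response function to include the effect of a quantum measurement of the current.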
Abstract:
Objective: We present a new evaluation of levodopa plasma concentrations and clinical effects during duodenal infusion of a levodopa/carbidopa gel (Duodopa) in 12 patients with advanced Parkinson's disease (PD), from a study reported previously (Nyholm et al, Clin Neuropharmacol 2003; 26(3): 156-163). One objective was to investigate in what state of PD we can see the greatest benefits with infusion compared with corresponding oral treatment (Sinemet CR). Another objective was to identify fluctuating response to levodopa and correlate it to variables related to disease progression. Methods: We have computed mean absolute error (MAE) and mean squared error (MSE) for the clinical rating from -3 (severe parkinsonism) to +3 (severe dyskinesia) as measures of the clinical state over the treatment periods of the study. The standard deviation (SD) of the rating was used as a measure of response fluctuations. Linear regression and visual inspection of graphs were used to estimate relationships between these measures and variables related to disease progression, such as years on levodopa (YLD) or the Unified PD Rating Scale part II (UPDRS II). Results: We found that MAE for infusion had a strong linear correlation to YLD (r2=0.80), while the corresponding relation for oral treatment looked more sigmoid, particularly for the more advanced patients (YLD>18).
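An illustrative sketch of the summary measures described above: MAE and MSE of the -3..+3 clinical rating relative to the neutral state 0 (neither parkinsonism nor dyskinesia), and the SD of the rating as the response-fluctuation measure. Treating 0 as the reference state is an assumption consistent with the scale, not a quote from the paper.

from statistics import mean, stdev

def rating_summary(ratings):
    """MAE, MSE and SD of a series of -3..+3 clinical ratings."""
    abs_err = [abs(r) for r in ratings]     # distance from the neutral state 0
    sq_err = [r * r for r in ratings]       # penalizes severe states more heavily
    return {"MAE": mean(abs_err), "MSE": mean(sq_err), "SD": stdev(ratings)}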
Abstract:
Objective: To compare results from various tapping tests with diary responses in advanced PD. Background: A home environment test battery for assessing patient state in advanced PD, consisting of diary assessments and motor tests, was constructed for a hand computer with touch screen and mobile communication. The diary questions: 1. walking, 2. time in "off", "on" and dyskinetic states, 3. off at worst, 4. dyskinetic at worst, 5. cramps, and 6. satisfied with function, relate to the recent past. Question 7, self-assessment, allows seven steps from -3 ("very off") to +3 ("very dyskinetic") and relates to right now. Tapping tests outline: 8. Alternately tapping two fields (un-cued) with the right hand; 9. Same as 8 but using the left hand; 10. Tapping an active field (out of two) following a system-generated rhythm (increasing speed) with the dominant hand; 11. Tapping an active field (out of four) that randomly changes location when tapped, using the dominant hand. Methods: 65 patients (currently on Duodopa, or candidates for this treatment) entered diary responses and performed tapping tests four times per day during one to six periods of seven days' length. In total there were 224 test periods and 6039 test occasions. Speed for tapping test 10 was discarded and tests 8 and 9 were combined by taking means. Descriptive statistics were used to present the variation of the test variables in relation to self-assessment (question 7). Pearson correlation coefficients between speed and accuracy (percent correct) in tapping tests and diary responses were calculated. Results: Mean compliance (percentage of completed test occasions per test period) was 83% and the median was 93%. There were large differences in both mean tapping speed and accuracy between the different self-assessed states. Correlations between diary responses and tapping results were small (-0.2 to 0.3, negative values for off-time and dyskinetic-time, which had opposite scale directions). Correlations between tapping results were all positive (0.1 to 0.6). Conclusions: The diary responses and tapping results provided different information. The low correlations can partly be explained by the fact that the questions related to the past, and by random variability, which could be reduced by taking means over test periods. Both tapping speed and accuracy reflect the motor function of the patient to a large extent.
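A minimal sketch of the correlation analysis described above, assuming the per-occasion variables are available as equal-length numeric arrays; the variable names in the usage comment are illustrative, not the study's actual data fields.

import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two test variables."""
    return float(np.corrcoef(np.asarray(x, dtype=float),
                             np.asarray(y, dtype=float))[0, 1])

# e.g. correlate mean tapping speed (tests 8-9 combined) with self-assessment (Q7):
# r = pearson(speed_8_9, self_assessment)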
Abstract:
Background: Linkage mapping is used to identify genomic regions affecting the expression of complex traits. However, when experimental crosses such as F2 populations or backcrosses are used to map regions containing a Quantitative Trait Locus (QTL), the size of the regions identified remains quite large, i.e. 10 or more Mb. Thus, other experimental strategies are needed to refine the QTL locations. Advanced Intercross Lines (AIL) are produced by repeated intercrossing of F2 animals and successive generations, which decreases linkage disequilibrium in a controlled manner. Although this approach is seen as promising, both to replicate QTL analyses and to fine-map QTL, only a few AIL datasets, all originating from inbred founders, have been reported in the literature. Methods: We have produced a nine-generation AIL pedigree (n = 1529) from two outbred chicken lines divergently selected for body weight at eight weeks of age. All animals were weighed at eight weeks of age and genotyped for SNPs located in nine genomic regions where significant or suggestive QTL had previously been detected in the F2 population. In parallel, we have developed a novel strategy to analyse the data that uses both genotype and pedigree information of all AIL individuals to replicate the detection of and fine-map QTL affecting juvenile body weight. Results: Five of the nine QTL detected with the original F2 population were confirmed and fine-mapped with the AIL, while for the remaining four, only suggestive evidence of their existence was obtained. All original QTL were confirmed as a single locus, except for one, which split into two linked QTL. Conclusions: Our results indicate that many of the QTL which are genome-wide significant or suggestive in the analyses of large intercross populations are true effects that can be replicated and fine-mapped using AIL. Key factors for success are the use of large populations and powerful statistical tools. Moreover, we believe that the statistical methods we have developed to efficiently study outbred AIL populations will increase the number of organisms in which complex traits can be analyzed in depth.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral as shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in time series. APEN requires determination of two parameters, namely the window size and the similarity measure. In our work, and after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment. The score generated by this method is henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity. The score generated by this method is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between the HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated to each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
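A hedged sketch of the Approximate Entropy computation described above, applied to a time series derived from the spiral data; the parameter values (window size 4, tolerance 0.2 x SD) and the normalization by completion time follow the abstract, while the choice of input series and the implementation details are standard ApEn, not the authors' code.

import numpy as np

def approximate_entropy(series, m=4, r_factor=0.2):
    """Standard ApEn with window size m and tolerance r = r_factor * SD(series)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def phi(m):
        # all overlapping windows of length m
        windows = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of windows
        dist = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)          # fraction of similar windows (incl. self)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def apen_score(series, completion_time_s):
    """APEN score normalized by total drawing completion time, as described above."""
    return approximate_entropy(series) / completion_time_s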
Abstract:
The aim of this study was to investigate whether a telemetry test battery can be used to measure the effects of Parkinson's disease (PD) treatment intervention and disease progression in patients with fluctuations. Sixty-five patients diagnosed with advanced PD were recruited in an open longitudinal 36-month study; 35 were treated with levodopa-carbidopa intestinal gel (LCIG) and 30 were candidates for switching from oral PD treatment to LCIG. They used a test battery, consisting of self-assessments of symptoms and fine motor tests (tapping and spiral drawings), four times per day in their homes during week-long test periods. The repeated measurements were summarized into an overall test score (OTS) to represent the global condition of the patient during a test period. Clinical assessments included ratings on the Unified PD Rating Scale (UPDRS) and the 39-item PD Questionnaire (PDQ-39). In LCIG-naïve patients, mean OTS compared to baseline was significantly improved from the first test period on LCIG treatment until month 24. In LCIG-non-naïve patients, there were no significant changes in mean OTS until month 36. The OTS correlated adequately with total UPDRS (rho = 0.59) and total PDQ-39 (0.59). Responsiveness measured as effect size was 0.696 and 0.536 for OTS and UPDRS, respectively. The trends of the test scores were similar to the trends of the clinical rating scores, but the dropout rate was high. Correlations between OTS and the clinical rating scales were adequate, indicating that the test battery captures important elements of the information in well-established scales. The responsiveness and reproducibility were better for OTS than for total UPDRS.
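A small sketch of the responsiveness (effect size) figure quoted above, using the common definition of mean change from baseline divided by the baseline standard deviation; the study may use a variant, so this is an assumption for illustration only.

from statistics import mean, stdev

def effect_size(baseline, follow_up):
    """Mean change from baseline divided by the baseline standard deviation."""
    changes = [f - b for b, f in zip(baseline, follow_up)]
    return mean(changes) / stdev(baseline)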