905 results for Advanced Encryption Standard
Abstract:
The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs "radio-hybrid" measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated in the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, a transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the read-in of multiple data formats including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request. (C) 2011 Elsevier B.V. All rights reserved.
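As a rough illustration of the "transparent and efficient handling of FFTs" highlighted above, the following minimal Python sketch (hypothetical code, not the actual Offline C++ interfaces; the class and method names are invented) shows one way a trace object can expose consistent time- and frequency-domain views by caching the spectrum and invalidating the cache when the samples change.

```python
# Hypothetical sketch (not the Auger Offline API): a radio trace that keeps its
# time-domain samples and FFT consistent, recomputing the spectrum only on demand.
import numpy as np

class RadioTrace:
    def __init__(self, samples, sampling_rate_hz):
        self._time = np.asarray(samples, dtype=float)
        self.sampling_rate_hz = sampling_rate_hz
        self._spectrum = None  # cached FFT, recomputed only when needed

    @property
    def time_series(self):
        return self._time

    @time_series.setter
    def time_series(self, samples):
        self._time = np.asarray(samples, dtype=float)
        self._spectrum = None  # invalidate the cached spectrum

    @property
    def spectrum(self):
        if self._spectrum is None:
            self._spectrum = np.fft.rfft(self._time)
        return self._spectrum

    @property
    def frequencies_hz(self):
        return np.fft.rfftfreq(self._time.size, d=1.0 / self.sampling_rate_hz)

# Example: band-limit a trace to 30-80 MHz by zeroing spectral bins outside the band.
trace = RadioTrace(np.random.randn(2048), sampling_rate_hz=200e6)
spec = trace.spectrum.copy()
band = (trace.frequencies_hz >= 30e6) & (trace.frequencies_hz <= 80e6)
spec[~band] = 0.0
trace.time_series = np.fft.irfft(spec, n=trace.time_series.size)
```

The appeal of such a design is that analysis code can request whichever domain it needs without triggering redundant transforms.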
Abstract:
This article presents important properties of standard discrete distributions and their conjugate densities. The Bernoulli and Poisson processes are described as generators of such discrete models. A characterization of distributions by mixtures is also introduced. This article adopts a novel singular notation and representation. Singular representations are unusual in statistical texts. Nevertheless, the singular notation makes it simpler to extend and generalize theoretical results and greatly facilitates numerical and computational implementation.
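As a reminder of the standard conjugacy results such a treatment builds on (textbook facts, stated here without the article's singular notation), the short sketch below shows the Beta-Bernoulli and Gamma-Poisson posterior updates, which reduce to adding sufficient statistics to the prior hyperparameters.

```python
# Standard conjugate updates for two discrete models:
# Beta is conjugate to Bernoulli, Gamma is conjugate to Poisson.

def beta_bernoulli_update(alpha, beta, successes, failures):
    """Beta(alpha, beta) prior + Bernoulli data -> Beta posterior."""
    return alpha + successes, beta + failures

def gamma_poisson_update(shape, rate, counts):
    """Gamma(shape, rate) prior + Poisson counts -> Gamma posterior."""
    return shape + sum(counts), rate + len(counts)

# Example: 7 successes and 3 failures update a uniform Beta(1, 1) prior,
# and three Poisson counts update a Gamma(2, 1) prior.
print(beta_bernoulli_update(1, 1, successes=7, failures=3))   # (8, 4)
print(gamma_poisson_update(2.0, 1.0, counts=[4, 6, 5]))       # (17.0, 4.0)
```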
Abstract:
This paper describes the applications of a new carbon paste electrode containing fibers of coconut (Cocos nucifera L.) fruit, which are very rich in peroxidase enzymes naturally immobilized on their structure. The new sensor was applied to the amperometric quantification of benzoyl peroxide in facial creams and dermatological shampoos. The amperometric measurements were performed in 0.1 mol L(-1) phosphate buffer (pH 5.2), at 0.0 V (versus Ag/AgCl). Under these conditions, benzoyl peroxide was rapidly determined in the 5.0-55 µmol L(-1) range, with a detection limit of 2.5 µmol L(-1) (s/n = 3), a response time of 4.1 s (90% of the steady state) and a sensitivity of 0.33 A mol L(-1) cm(-2). The amperometric results are in good agreement with those obtained by a spectrophotometric technique, used as the standard method. (C) 2009 Elsevier B.V. All rights reserved.
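The detection limit quoted above (s/n = 3) is conventionally derived from the calibration slope and the blank noise; the sketch below illustrates the calculation with hypothetical numbers, not the paper's raw data.

```python
# Illustrative detection-limit estimate from an amperometric calibration:
# fit current vs. concentration, then LOD = 3 * (blank noise) / slope.
import numpy as np

conc_umol_L = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])   # benzoyl peroxide standards
current_uA  = np.array([0.9, 2.8, 4.6, 6.5, 8.3, 10.2])       # hypothetical responses at 0.0 V

slope, intercept = np.polyfit(conc_umol_L, current_uA, 1)      # linear calibration
blank_noise_uA = 0.15                                          # std. dev. of the blank signal
lod_umol_L = 3.0 * blank_noise_uA / slope

print(f"slope = {slope:.3f} uA per umol/L, LOD = {lod_umol_L:.1f} umol/L")
```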
Abstract:
A dynamic atmosphere generator with a naphthalene emission source has been constructed and used for the development and evaluation of a bioluminescence sensor based on the bacteria Pseudomonas fluorescens HK44 immobilized in 2% agar gel (101 cell mL(-1)) placed in sampling tubes. A steady naphthalene emission rate (around 7.3 nmol min(-1) at 27 degrees C and 7.4 mL min(-1) of purified air) was obtained by covering the diffusion unit containing solid naphthalene with a PTFE filter membrane. The time elapsed from gelation of the agar matrix to analyte exposure ("maturation time") was found to be relevant for the bioluminescence assays, being most favorable between 1.5 and 3 h. The maximum light emission, observed after 80 min, is dependent on the analyte concentration and the exposure time (evaluated between 5 and 20 min), but not on the flow rate of naphthalene in the sampling tube, over the range of 1.8-7.4 nmol min(-1). A good linear response was obtained between 50 and 260 nmol L(-1), with a limit of detection estimated at 20 nmol L(-1), far below the recommended threshold limit value for naphthalene in air. (c) 2008 Elsevier B.V. All rights reserved.
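The concentration delivered by such a dynamic generator follows directly from the emission rate and the carrier flow; the short calculation below uses the figures quoted above and is only an order-of-magnitude illustration of the undiluted primary stream.

```python
# Order-of-magnitude check of the generated naphthalene concentration in the
# undiluted carrier stream, from the emission rate and air flow quoted above.
emission_nmol_per_min = 7.3
air_flow_L_per_min = 7.4e-3          # 7.4 mL/min

concentration_nmol_per_L = emission_nmol_per_min / air_flow_L_per_min
print(f"{concentration_nmol_per_L:.0f} nmol/L in the primary stream")  # ~986 nmol/L
# Working concentrations in the calibrated 50-260 nmol/L range would therefore
# require further dilution downstream (an assumption about the setup, not a
# detail stated in the abstract).
```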
Abstract:
The purpose of this work is to develop a web-based decision support system, based on fuzzy logic, to assess the motor state of Parkinson patients from their performance in on-screen motor tests in a test battery on a hand computer. A set of well-defined rules, based on an expert's knowledge, was made to diagnose the current state of the patient. At the end of a period, an overall score is calculated which represents the overall state of the patient during the period. Acceptability of the rules is based on the absolute difference between the patient's own assessment of his condition and the diagnosed state. Any inconsistency can be tracked as a highlighted alert in the system. Graphical presentation of data aims at enhanced analysis of the patient's state and performance monitoring by the clinic staff. In general, the system is beneficial for the clinic staff, patients, project managers and researchers.
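A toy sketch of the rule-checking idea described above (purely illustrative; the rules, thresholds and function names are invented, not taken from the actual system): a rule maps test-battery scores to a diagnosed state on the same -3 to +3 scale as the patient's self-assessment, and a large absolute difference between the two raises an alert.

```python
# Hypothetical rule-based diagnosis and consistency check, not the real system.

def diagnose(tapping_speed, tapping_accuracy):
    """Toy rule: slow, inaccurate tapping -> 'off' (negative); very fast -> dyskinetic-leaning."""
    if tapping_speed < 1.0 and tapping_accuracy < 0.6:
        return -2
    if tapping_speed > 3.0:
        return 1
    return 0

def inconsistent(self_assessment, diagnosed, tolerance=2):
    """Flag an alert when diagnosis and self-assessment disagree by more than the tolerance."""
    return abs(self_assessment - diagnosed) > tolerance

state = diagnose(tapping_speed=0.8, tapping_accuracy=0.5)
print(state, inconsistent(self_assessment=+2, diagnosed=state))  # -2 True -> raise an alert
```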
Abstract:
Advanced Building Energy Data Visualization is a way to detect performance problems in commercial buildings. By placing sensors in a building that collect data on, for example, air temperature and electrical power, it becomes possible to process the data in data visualization software. This software generates visual diagrams so the building manager or building operator can see if, for example, the power consumption is too high. A first step (before sensors are installed in a building) to see how much energy a building consumes can be to use a benchmarking tool. A number of benchmarking tools are available for free on the Internet. Each tool has a slightly different approach, but they all show how much energy a building consumes compared to other similar buildings. In this study a new web design for the benchmarking tool CalARCH has been developed. CalARCH is developed at the Berkeley Lab in Berkeley, California, USA. CalARCH uses data collected only from buildings in California, and is only for comparing buildings in California with other similar buildings in the state. Five different versions of the web site were made. Then a web survey was done to determine which version would be the best for CalARCH. The results showed that Version 5 and Version 3 were the best. Then a new version was made, based on these two versions. This study was made at the Lawrence Berkeley Laboratory.
Abstract:
Text messaging is a new form of writing, brought about by technological development in the last couple of decades. Mobile phone usage has increased rapidly worldwide and texting is now part of many people's everyday communication. A large number of users send or receive texts which include some abbreviations and shortenings, commonly referred to as textspeak. This novel linguistic phenomenon is perceived by some with indifference and by others with aggravation. The following study examines attitudes towards this linguistic change from a gender and age perspective. The comparison between two groups shows that the most conservative and least positive toward change are young women. The analysis and discussion around this focuses on power, prestige and patterns.
Abstract:
The study reported here is part of a large project for evaluation of the Thermo-Chemical Accumulator (TCA), a technology under development by the Swedish company ClimateWell AB. The studies concentrate on the use of the technology for comfort cooling. This report concentrates on measurements in the laboratory, modelling and system simulation. The TCA is a three-phase absorption heat pump that stores energy in the form of crystallised salt, in this case lithium chloride (LiCl), with water being the other substance. The process requires vacuum conditions, as with standard absorption chillers using LiBr/water. Measurements were carried out in the laboratories at the Solar Energy Research Center SERC, at Högskolan Dalarna, as well as at ClimateWell AB. The measurements at SERC were performed on prototype version 7:1 and showed that this prototype had several problems resulting in poor and unreliable performance. The main results were that: there was significant corrosion leading to non-condensable gases that in turn caused very poor performance; unwanted crystallisation caused blockages as well as inconsistent behaviour; poor wetting of the heat exchangers resulted in relatively high temperature drops there. A measured thermal COP for cooling of 0.46 was found, which is significantly lower than the theoretical value. These findings resulted in a thorough redesign for the new prototype, called ClimateWell 10 (CW10), which was tested briefly by the authors at ClimateWell. The data collected here were limited, but enough to show that the machine worked consistently with no noticeable vacuum problems. They were also sufficient for identifying the main parameters in a simulation model developed for the TRNSYS simulation environment, but not enough to verify the model properly. This model was shown to be able to simulate the dynamic as well as static performance of the CW10, and was then used in a series of system simulations. A single system model was developed as the basis of the system simulations, consisting of a CW10 machine, 30 m2 of flat plate solar collectors with a backup boiler, and an office with a design cooling load in Stockholm of 50 W/m2, resulting in a 7.5 kW design load for the 150 m2 floor area. Two base cases were defined based on this: one for Stockholm using a dry cooler with a design cooling rate of 30 kW; one for Madrid with a cooling tower with a design cooling rate of 34 kW. A number of parametric studies were performed based on these two base cases. These showed that the temperature lift is a limiting factor for cooling at higher ambient temperatures and for charging with a fixed-temperature source such as district heating. The simulated evacuated tube collector performs only marginally better than a good flat plate collector when considering the gross area, the margin being greater for larger solar fractions. For a 30 m2 collector, solar fractions of 49% and 67% were achieved for the Stockholm and Madrid base cases, respectively. The average annual efficiency of the collector in Stockholm (12%) was much lower than that in Madrid (19%). The thermal COP was simulated to be approximately 0.70, but it has not been possible to verify this with measured data. The annual electrical COP was shown to be very dependent on the cooling load, as a large proportion of the electricity use is for components that are permanently on. For the cooling loads studied, the annual electrical COP ranged from 2.2 for a 2000 kWh cooling load to 18.0 for a 21000 kWh cooling load.
There is, however, potential to reduce the electricity consumption of the machine, which would improve these figures significantly. It was shown that a cooling tower is necessary for the Madrid climate, whereas a dry cooler is sufficient for Stockholm, although a cooling tower does improve performance. The simulation study was very shallow and has shown a number of areas that are important to study in more depth. One such area is advanced control strategy, which is necessary to mitigate the weakness of the technology (low temperature lift for cooling) and to optimally use its strength (storage).
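A back-of-the-envelope reading of the electrical-COP figures above: if annual electricity use is assumed (our simplification, not the report's model) to be a fixed standby term plus a term proportional to delivered cooling, the two reported operating points determine both terms.

```python
# Solve for fixed and marginal electricity use from the two reported points:
# annual electrical COP = 2.2 at 2000 kWh of cooling and 18.0 at 21000 kWh.
q1, cop1 = 2000.0, 2.2       # kWh cooling, annual electrical COP
q2, cop2 = 21000.0, 18.0

e1, e2 = q1 / cop1, q2 / cop2            # implied annual electricity use [kWh]
marginal = (e2 - e1) / (q2 - q1)         # kWh_el per kWh of cooling delivered
fixed = e1 - marginal * q1               # permanently-on components [kWh/yr]

print(f"fixed ~ {fixed:.0f} kWh/yr, marginal ~ {marginal:.3f} kWh_el/kWh_cool")
# Roughly 880 kWh/yr of fixed consumption dominates at low cooling loads, which
# is why reducing standby electricity would improve the figures significantly.
```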
Abstract:
The thesis belongs to the field of lexical semantics studies, associated with describing the Russian linguistic world-image. The research focuses on the universal situation of purchase and sale as reflected in the Russian lexical standard and sub-standard. The work also deals with subjects related to the sphere of social linguistics: the social stratification of the language, the structure of the sub-standard, etc. The thesis is a contribution to the description of the Russian linguistic world-image as well as to the further elaboration of the conceptual analysis method. The results are applicable in teaching Russian as a foreign language, particularly in lexis and in Russian culture and mentality studies.
Abstract:
We generalize the standard linear-response (Kubo) theory to obtain the conductivity of a system that is subject to a quantum measurement of the current. Our approach can be used to specifically elucidate how back-action inherent to quantum measurements affects electronic transport. To illustrate the utility of our general formalism, we calculate the frequency-dependent conductivity of graphene and discuss the effect of measurement-induced decoherence on its value in the dc limit. We are able to resolve an ambiguity related to the parametric dependence of the minimal conductivity.
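For reference, the standard (measurement-free) linear-response expressions being generalized can be written, in one common textbook convention, as below; the generalization described above adds the effect of measurement back-action to this standard result.

```latex
% General Kubo response of an observable A to a perturbation H'(t) = f(t) B,
% defined by delta<A(t)> = \int dt' \chi_{AB}(t - t') f(t'):
\[
\chi_{AB}(t) \;=\; -\frac{i}{\hbar}\,\theta(t)\,
  \big\langle \big[\hat{A}(t),\,\hat{B}(0)\big] \big\rangle
\]

% Specializing to current operators gives the conductivity tensor
% (one common convention; V is the volume, n the carrier density):
\[
\sigma_{\alpha\beta}(\omega) \;=\; \frac{i}{\omega + i0^{+}}
  \left[ \frac{n e^{2}}{m}\,\delta_{\alpha\beta}
  \;+\; \frac{1}{V}\,\chi_{j_{\alpha} j_{\beta}}(\omega) \right]
\]
```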
Abstract:
Objective: We present a new evaluation of levodopa plasma concentrations and clinical effects during duodenal infusion of a levodopa/carbidopa gel (Duodopa) in 12 patients with advanced Parkinson's disease (PD), from a study reported previously (Nyholm et al, Clin Neuropharmacol 2003; 26(3): 156-163). One objective was to investigate in what state of PD we can see the greatest benefits with infusion compared with corresponding oral treatment (Sinemet CR). Another objective was to identify fluctuating response to levodopa and correlate it with variables related to disease progression. Methods: We have computed the mean absolute error (MAE) and mean squared error (MSE) of the clinical rating, which ranges from -3 (severe parkinsonism) to +3 (severe dyskinesia), as measures of the clinical state over the treatment periods of the study. The standard deviation (SD) of the rating was used as a measure of response fluctuations. Linear regression and visual inspection of graphs were used to estimate relationships between these measures and variables related to disease progression such as years on levodopa (YLD) or unified PD rating scale part II (UPDRS II). Results: We found that the MAE for infusion had a strong linear correlation with YLD (r2 = 0.80), while the corresponding relation for oral treatment looked more sigmoid, particularly for the more advanced patients (YLD > 18).
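The summary measures described above are simple to state in code; the sketch below uses hypothetical ratings and assumes (our reading, not stated explicitly in the abstract) that the error is taken relative to 0, the ideal on-state without dyskinesia.

```python
# MAE, MSE and SD of a series of clinical ratings on the -3..+3 scale.
import numpy as np

ratings = np.array([-1, 0, 0, +1, -2, 0, +1, 0])   # hypothetical ratings over one period

mae = np.mean(np.abs(ratings - 0))     # mean absolute deviation from the ideal state 0
mse = np.mean((ratings - 0) ** 2)      # penalizes severe off/dyskinesia more strongly
fluctuation = np.std(ratings, ddof=1)  # SD as a measure of response fluctuations

print(f"MAE={mae:.2f}  MSE={mse:.2f}  SD={fluctuation:.2f}")
```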
Abstract:
Objective: To compare results from various tapping tests with diary responses in advanced PD. Background: A home environment test battery for assessing patient state in advanced PD, consisting of diary assessments and motor tests, was constructed for a hand computer with touch screen and mobile communication. The diary questions: 1. walking, 2. time in off, on and dyskinetic states, 3. off at worst, 4. dyskinetic at worst, 5. cramps, and 6. satisfied with function, relate to the recent past. Question 7, self-assessment, allows seven steps from -3 (very off) to +3 (very dyskinetic) and relates to right now. Tapping tests outline: 8. Alternately tapping two fields (un-cued) with the right hand. 9. Same as 8 but using the left hand. 10. Tapping an active field (out of two) following a system-generated rhythm (increasing speed) with the dominant hand. 11. Tapping an active field (out of four) that randomly changes location when tapped, using the dominant hand. Methods: 65 patients (currently on Duodopa, or candidates for this treatment) entered diary responses and performed tapping tests four times per day during one to six periods of seven days' length. In total there were 224 test periods and 6039 test occasions. Speed for tapping test 10 was discarded, and tests 8 and 9 were combined by taking means. Descriptive statistics were used to present the variation of the test variables in relation to self-assessment (question 7). Pearson correlation coefficients between speed and accuracy (percent correct) in tapping tests and diary responses were calculated. Results: Mean compliance (percentage of completed test occasions per test period) was 83% and the median was 93%. There were large differences in both mean tapping speed and accuracy between the different self-assessed states. Correlations between diary responses and tapping results were small (-0.2 to 0.3, negative values for off-time and dyskinetic-time, which had opposite scale directions). Correlations between tapping results were all positive (0.1 to 0.6). Conclusions: The diary responses and tapping results provided different information. The low correlations can partly be explained by the fact that the questions related to the past, and by random variability, which could be reduced by taking means over test periods. Both tapping speed and accuracy reflect the motor function of the patient to a large extent.
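In outline, the reported summary statistics can be reproduced as below (hypothetical data): compliance as the percentage of completed test occasions, tests 8 and 9 combined by taking means, and a Pearson correlation between tapping speed and diary question 7.

```python
# Compliance, combination of tests 8 and 9, and a Pearson correlation sketch.
import numpy as np

completed, scheduled = 24, 28                      # one 7-day period, 4 occasions/day
compliance = 100.0 * completed / scheduled         # ~86%

right_speed = np.array([2.1, 2.4, 1.8, 2.6, 2.0])  # test 8 (taps/s), per occasion
left_speed  = np.array([1.9, 2.2, 1.7, 2.5, 1.9])  # test 9
tapping_speed = (right_speed + left_speed) / 2     # tests 8 and 9 combined by taking means

self_assessment = np.array([0, +1, -1, +1, 0])     # diary question 7, -3..+3 scale
r = np.corrcoef(tapping_speed, self_assessment)[0, 1]
print(f"compliance={compliance:.0f}%  Pearson r={r:.2f}")
```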
Abstract:
Background: Linkage mapping is used to identify genomic regions affecting the expression of complex traits. However, when experimental crosses such as F2 populations or backcrosses are used to map regions containing a Quantitative Trait Locus (QTL), the size of the regions identified remains quite large, i.e. 10 or more Mb. Thus, other experimental strategies are needed to refine the QTL locations. Advanced Intercross Lines (AIL) are produced by repeated intercrossing of F2 animals and successive generations, which decreases linkage disequilibrium in a controlled manner. Although this approach is seen as promising, both to replicate QTL analyses and to fine-map QTL, only a few AIL datasets, all originating from inbred founders, have been reported in the literature. Methods: We have produced a nine-generation AIL pedigree (n = 1529) from two outbred chicken lines divergently selected for body weight at eight weeks of age. All animals were weighed at eight weeks of age and genotyped for SNPs located in nine genomic regions where significant or suggestive QTL had previously been detected in the F2 population. In parallel, we have developed a novel strategy to analyse the data that uses both genotype and pedigree information of all AIL individuals to replicate the detection of, and fine-map, QTL affecting juvenile body weight. Results: Five of the nine QTL detected with the original F2 population were confirmed and fine-mapped with the AIL, while for the remaining four, only suggestive evidence of their existence was obtained. All original QTL were confirmed as a single locus, except for one, which split into two linked QTL. Conclusions: Our results indicate that many of the QTL which are genome-wide significant or suggestive in the analyses of large intercross populations are true effects that can be replicated and fine-mapped using AIL. Key factors for success are the use of large populations and powerful statistical tools. Moreover, we believe that the statistical methods we have developed to efficiently study outbred AIL populations will increase the number of organisms for which complex traits can be analyzed in depth.
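As a purely illustrative sketch (not the authors' statistical method; the kinship matrix, variance ratio and effect sizes are placeholders), the snippet below shows the general shape of an association test that uses relatedness information: a SNP effect on body weight estimated by generalized least squares with a kinship-structured covariance.

```python
# Toy SNP-trait association accounting for relatedness via a kinship matrix K.
import numpy as np

rng = np.random.default_rng(0)
n = 200
K = np.eye(n)                      # kinship matrix; identity used here as a placeholder
genotype = rng.integers(0, 3, n)   # SNP coded as 0/1/2 copies of one allele
weight = 800 + 15 * genotype + rng.normal(0, 40, n)   # body weight at 8 weeks [g], simulated

# Phenotypic covariance: sigma_g^2 * K + sigma_e^2 * I (variance ratio assumed known here).
V = 0.3 * K + 0.7 * np.eye(n)
Vinv = np.linalg.inv(V)

X = np.column_stack([np.ones(n), genotype])
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ weight)   # GLS estimate
print(f"estimated SNP effect: {beta[1]:.1f} g per allele copy")
```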