Abstract:
Item response theory (IRT) comprises a set of statistical models that are useful in many fields, especially when there is interest in studying latent variables. These latent variables are directly considered in item response models (IRM) and are usually called latent traits. A usual assumption for parameter estimation of an IRM, considering one group of examinees, is that the latent traits are random variables following a standard normal distribution. However, many works suggest that this assumption does not hold in many cases, and when it fails the parameter estimates tend to be biased and inference can be misleading. It is therefore important to model the distribution of the latent traits properly. In this paper we present an alternative latent trait model based on the so-called skew-normal distribution; see Genton (2004). We use the centred parameterization proposed by Azzalini (1985), which ensures model identifiability, as pointed out by Azevedo et al. (2009b). A Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm was built for parameter estimation using an augmented-data approach. A simulation study was performed to assess parameter recovery under the proposed model and estimation method, as well as the effect of the asymmetry level of the latent trait distribution on parameter estimation. We also compared our approach with estimation methods that assume a symmetric normal distribution for the latent traits. The results indicated that the proposed algorithm recovers all parameters properly. Specifically, the greater the asymmetry level, the better the performance of our approach compared with the alternatives, mainly for small sample sizes (numbers of examinees). Furthermore, we analyzed a real data set that shows indications of asymmetry in the latent trait distribution. The results obtained with our approach confirmed a strong negative asymmetry of the latent trait distribution.
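A minimal sketch of the Metropolis-Hastings step inside such an MHWGS scheme, not the authors' implementation: item parameters are held fixed, the direct (rather than centred) skew-normal parameterization is used for brevity, and the shape value and simulated data are assumptions for illustration.

import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)

def loglik(theta, a, b, y):
    # 2PL log-likelihood of one examinee's binary responses y
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return np.sum(y * np.log(p) + (1 - y) * np.log1p(-p))

def mh_step(theta, a, b, y, shape=-3.0, step=0.5):
    # Random-walk proposal; skew-normal population prior on the trait
    prop = theta + step * rng.standard_normal()
    log_ratio = (loglik(prop, a, b, y) + skewnorm.logpdf(prop, shape)
                 - loglik(theta, a, b, y) - skewnorm.logpdf(theta, shape))
    return prop if np.log(rng.uniform()) < log_ratio else theta

# Simulated data: 20 items, one examinee with true trait -1.0
a = rng.uniform(0.8, 2.0, 20)                    # discriminations
b = rng.normal(0.0, 1.0, 20)                     # difficulties
y = rng.binomial(1, 1 / (1 + np.exp(-a * (-1.0 - b))))

theta, draws = 0.0, []
for _ in range(5000):
    theta = mh_step(theta, a, b, y)
    draws.append(theta)
print("posterior mean of the latent trait:", np.mean(draws[1000:]))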
Abstract:
This article presents important properties of standard discrete distributions and their conjugate densities. The Bernoulli and Poisson processes are described as generators of such discrete models. A characterization of distributions by mixtures is also introduced. The article adopts a novel singular notation and representation; singular representations are unusual in statistical texts, but the singular notation makes it simpler to extend and generalize theoretical results and greatly facilitates numerical and computational implementation.
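As a hedged illustration of the discrete-distribution conjugacy mentioned above (a standard Beta-Bernoulli update, not the article's singular notation), the posterior follows from the prior by a simple count adjustment:

from scipy import stats

alpha, beta = 2.0, 2.0            # Beta prior hyperparameters (illustrative)
data = [1, 0, 1, 1, 0, 1, 1]      # Bernoulli observations (illustrative)
s = sum(data)                     # number of successes
f = len(data) - s                 # number of failures

# Conjugacy: Beta(alpha, beta) prior + Bernoulli data -> Beta(alpha + s, beta + f)
posterior = stats.beta(alpha + s, beta + f)
print("posterior mean:", posterior.mean())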
Abstract:
This paper describes the application of a new carbon paste electrode containing fibers of the coconut (Cocos nucifera L.) fruit, which are very rich in peroxidase enzymes naturally immobilized in their structure. The new sensor was applied to the amperometric quantification of benzoyl peroxide in facial creams and dermatological shampoos. The amperometric measurements were performed in 0.1 mol L⁻¹ phosphate buffer (pH 5.2) at 0.0 V (versus Ag/AgCl). Under these conditions, benzoyl peroxide was rapidly determined in the 5.0-55 µmol L⁻¹ range, with a detection limit of 2.5 µmol L⁻¹ (s/n = 3), a response time of 4.1 s (90% of the steady state), and a sensitivity of 0.33 A mol L⁻¹ cm⁻². The amperometric results are in good agreement with those obtained by a spectrophotometric technique used as the standard method.
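A minimal sketch of how a detection limit of this kind is commonly estimated from a linear calibration (LOD = 3·s_blank/slope, consistent with the s/n = 3 criterion); the concentrations, currents, and blank noise below are made-up values, not the paper's data:

import numpy as np

# Hypothetical calibration points: concentration (umol/L) vs. current (uA)
conc = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])
current = np.array([1.7, 5.1, 8.4, 11.6, 15.0, 18.2])

slope, intercept = np.polyfit(conc, current, 1)  # least-squares calibration line
s_blank = 0.28                                   # std. dev. of the blank signal (assumed)

lod = 3 * s_blank / slope                        # signal-to-noise = 3 criterion
print(f"slope = {slope:.3f} uA per umol/L, LOD = {lod:.2f} umol/L")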
Abstract:
A dynamic atmosphere generator with a naphthalene emission source was constructed and used for the development and evaluation of a bioluminescence sensor based on the bacterium Pseudomonas fluorescens HK44 immobilized in 2% agar gel (101 cells mL⁻¹) placed in sampling tubes. A steady naphthalene emission rate (around 7.3 nmol min⁻¹ at 27 °C and 7.4 mL min⁻¹ of purified air) was obtained by covering the diffusion unit containing solid naphthalene with a PTFE filter membrane. The time elapsed from gelation of the agar matrix to analyte exposure ("maturation time") was found to be relevant for the bioluminescence assays, being most favorable between 1.5 and 3 h. The maximum light emission, observed after 80 min, depends on the analyte concentration and the exposure time (evaluated between 5 and 20 min), but not on the naphthalene flow rate in the sampling tube over the range of 1.8-7.4 nmol min⁻¹. A good linear response was obtained between 50 and 260 nmol L⁻¹, with a limit of detection estimated at 20 nmol L⁻¹, far below the recommended threshold limit value for naphthalene in air.
Abstract:
In recent years, Mg-Ni-based metastable alloys have been attracting attention due to their large hydrogen sorption capacities, low weight, low cost, and high availability. Despite the large discharge capacity and high activity of these alloys, the accelerated degradation of the discharge capacity after only a few charge-discharge cycles is the main shortcoming preventing their commercial use in batteries. The addition of alloying elements has been shown to be an effective way of improving the electrode performance of Mg-Ni-based alloys. In the present work, the effect of Ti and Pt alloying additions on the structure and electrode performance of a binary Mg-Ni alloy was investigated. XRD and HRTEM revealed that all the investigated alloy compositions had multi-phase nanostructures, with crystallite sizes on the order of 6 nm. Moreover, the investigated alloying elements produced remarkable improvements in both maximum discharge capacity and cycling life. Simultaneous addition of Ti and Pt demonstrated a synergetic effect on the electrochemical properties of the alloy electrodes. Among the investigated alloys, the best electrochemical performance was obtained for the Mg₅₁Ti₄Ni₄₃Pt₂ composition (in at.%), which achieved a maximum discharge capacity of 448 mAh g⁻¹ and retained almost 66% of this capacity after 10 cycles. In contrast, the binary Mg₅₅Ni₄₅ alloy achieved only 248 mAh g⁻¹ and retained 11% of this capacity after 10 cycles.
Abstract:
Text messaging is a new form of writing, brought about by technological developments of the last couple of decades. Mobile phone usage has increased rapidly worldwide, and texting is now part of many people's everyday communication. A large number of users send or receive texts that include abbreviations and shortenings, commonly referred to as textspeak. This novel linguistic phenomenon is perceived by some with indifference and by others with aggravation. The following study examines attitudes towards this linguistic change from a gender and age perspective. The comparison between two groups shows that the most conservative, and the least positive towards change, are young women. The analysis and discussion focus on power, prestige, and patterns.
Abstract:
The thesis belongs to the field of lexical semantics, associated with describing the Russian linguistic world-image. The research focuses on the universal situation of purchase and sale as reflected in the Russian lexical standard and sub-standard. The work also deals with subjects related to sociolinguistics: the social stratification of the language, the structure of the sub-standard, etc. The thesis is a contribution to the description of the Russian linguistic world-image as well as to the further elaboration of the conceptual analysis method. The results are applicable in teaching Russian as a foreign language, particularly in lexis and in studies of Russian culture and mentality.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects: 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in the subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. It requires two parameters: the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by the total drawing completion time and used in the subsequent analysis; the score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores used in the subsequent analysis. The first was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment, henceforth denoted WAV. The second was based on the standard deviation of frequency-filtered drawing velocity, henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups (P=0.626 for the S group, with a 9.9% mean value difference; P=0.089 for the I group, with 30.2%; and P=0.0019 for the A group, with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
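A minimal sketch of the Approximate Entropy computation described above, with window size m = 4 and tolerance r = 0.2 times the series standard deviation as stated in the METHODS; the input series and the normalization step below are illustrative assumptions:

import numpy as np

def apen(x, m=4, r_frac=0.2):
    # Approximate entropy of a 1-D series with window m and tolerance
    # r = r_frac * std(x), the parameter choices reported above.
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.sum(d <= r, axis=1) / n   # self-matches included, as in classic APEN
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

# Hypothetical usage on inter-sample time increments of one spiral trace,
# normalized by total drawing completion time as described above
ts_ms = np.cumsum(np.random.default_rng(1).gamma(2.0, 5.0, 300))  # fake timestamps
score = apen(np.diff(ts_ms)) / (ts_ms[-1] / 1000.0)
print("normalized APEN score:", score)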
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces, but it has not considered procedures and methods for designing river geospatial data. As a result, river geospatial data has its own model, and when data are shared through open interfaces among heterogeneous GIS databases, differences between models result in loss of information. In this study, a plan was suggested both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and improving and redesigning the database. Primary and foreign keys, which distinguish attribute information and entity linkages, were redefined to increase usability, and the attribute tables and entity relationship diagram were newly defined to redesign the linkages among tables from the perspective of a river standard database; a minimal schema sketch follows below. In addition, this study aimed to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
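A minimal sketch, with hypothetical table and column names rather than the study's actual schema, of how redefined primary and foreign keys can link a river entity to its attribute tables; SQLite is used purely for illustration:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE river (
    river_id TEXT PRIMARY KEY,        -- redefined primary key
    name     TEXT NOT NULL
);
CREATE TABLE river_facility (
    facility_id TEXT PRIMARY KEY,
    river_id    TEXT NOT NULL REFERENCES river(river_id),  -- foreign key to the parent river
    type        TEXT,
    station_km  REAL                  -- position along the river (hypothetical attribute)
);
""")
conn.execute("INSERT INTO river VALUES ('R001', 'Example River')")
conn.execute("INSERT INTO river_facility VALUES ('F001', 'R001', 'levee', 12.5)")
print(conn.execute(
    "SELECT r.name, f.type FROM river_facility f JOIN river r USING (river_id)"
).fetchall())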
Abstract:
Audio coding is used to compress digital audio signals, reducing the number of bits needed to transmit or store an audio signal. This is useful when network bandwidth or storage capacity is very limited. Audio compression algorithms are based on an encoding and a decoding process. In the encoding step, the uncompressed audio signal is transformed into a coded representation, thereby compressing the audio signal. The coded audio signal eventually needs to be restored (e.g., for playback) through decoding: the decoder receives the bitstream and reconverts it into an uncompressed signal. ISO-MPEG is a standard for high-quality, low-bit-rate video and audio coding. The audio part of the standard is composed of algorithms for high-quality, low-bit-rate audio coding, i.e., algorithms that reduce the original bit rate while guaranteeing high quality of the audio signal. The audio coding algorithms consist of MPEG-1 (with three different layers), MPEG-2, MPEG-2 AAC, and MPEG-4. This work presents a study of the MPEG-4 AAC audio coding algorithm. It also presents implementations of the AAC algorithm on different platforms, in C, in Intel Pentium assembly, in C using a DSP processor, and in HDL. Since each implementation has its own application niche, each one is valid as a final solution. A further purpose of this work is the comparison among these implementations, considering estimated costs, execution time, and the advantages and disadvantages of each one.
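A toy encode/decode round trip (uniform scalar quantization only), sketching the generic lossy pipeline described above; a real MPEG-4 AAC coder adds an MDCT filter bank, a psychoacoustic model, and entropy coding on top of quantization:

import numpy as np

def encode(signal, bits=8):
    # Map samples in [-1, 1] to integer codes: the "coded representation"
    levels = 2 ** bits
    codes = np.round((signal + 1) / 2 * (levels - 1))
    return np.clip(codes, 0, levels - 1).astype(int)

def decode(codes, bits=8):
    # Reconstruct an approximation of the original samples
    return codes / (2 ** bits - 1) * 2 - 1

t = np.linspace(0, 1, 8000, endpoint=False)
x = 0.5 * np.sin(2 * np.pi * 440 * t)              # a 440 Hz test tone
x_hat = decode(encode(x))
snr = 10 * np.log10(np.sum(x ** 2) / np.sum((x - x_hat) ** 2))
print(f"SNR after the 8-bit round trip: {snr:.1f} dB")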
Abstract:
Introduction and Objectives: The central nervous system (CNS) is a frequent site of relapse in children with acute lymphocytic leukemia (ALL). There is evidence that a traumatic lumbar puncture (TLP) may represent an additional risk of CNS relapse when blasts are inoculated into the cerebrospinal fluid (CSF). This study aims to determine whether the occurrence of TLP at diagnosis affects the prognosis of patients with this disease. Material and Methods: Seventy-seven patients diagnosed with ALL, treated between 1992 and 2002, were included in the analysis. Intrathecal chemotherapy (ITC) was instilled either immediately after the initial LP (early) or at a second LP (late) performed 24 to 48 hours after the initial LP. The influence of TLP and of the timing of ITC administration (early vs. late) on CNS relapse was analyzed. Results: Among the 19 patients who had a TLP at diagnosis and received late ITC, six had an isolated CNS relapse and two had a combined relapse in the CNS and bone marrow (BM). Among the nine patients who had a TLP and received early ITC, only one had a combined CNS and BM relapse (P=0.20); there was therefore no statistically significant influence of TLP on event-free survival (EFS) (55% for early ITC vs. 49% for late ITC) (P=0.37). However, in a stratified analysis according to risk groups, we observed that for low- or intermediate-risk patients the OR was 0.8 with late ITC (P=0.99) and 0.17 with early ITC (P=0.47). On the other hand, among high-risk patients the OR for relapse was 21.0 for those receiving late ITC (P=0.09) and 1.5 for the group receiving early ITC (P=0.99). Conclusion: The results of the present study suggest that the occurrence of TLP adversely influences the prognosis of ALL patients at high risk of relapse. Since these results come from a retrospective study, we recommend that they be confirmed in prospective randomized studies.
Abstract:
The aim of this article is to examine the influence of political variables on the determination of the exchange rate in four Latin American countries that experienced high inflation and current account deficits during the 1970s and 1980s. Empirical studies had already demonstrated the influence of elections; none, however, had incorporated the decision structure of the Executive and the Legislature into this process. It was only possible to incorporate the political regime (authoritarian/democratic) and the division of power in the Legislature of all the countries into a standard exchange rate model because we used panel techniques. We obtained the following results: countries classified as authoritarian exhibited a more appreciated exchange rate, and more fragmented legislatures were associated with a more depreciated exchange rate. We viewed the latter result with suspicion, since among the countries in the sample the authoritarian regime was in some cases a military dictatorship in which the Legislature had little say in decisions. Interacting the political regime with fragmentation, we found that the effect of the regime classification predominates: under an authoritarian regime, the exchange rate resulting from the interaction is still appreciated, and the division of power in the Legislature only reduces the magnitude of the appreciation.
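A hedged sketch of such a panel specification, with country fixed effects, regime and legislative-fragmentation regressors, and their interaction; the file name and column names are hypothetical, not the article's data:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per country-year, columns as named below
df = pd.read_csv("panel_latam.csv")

model = smf.ols(
    "real_exchange_rate ~ authoritarian + leg_fragmentation"
    " + authoritarian:leg_fragmentation + inflation + C(country)",
    data=df,
).fit()
print(model.summary())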
Abstract:
The choice of topic was restricted to the housing construction subsector, which is characterized by the presence of small and medium-sized firms. The consolidation of large firms in this line of business tends to occur mainly in settings where, through state intervention, large-scale housing programs are defined that require participating firms to have a larger volume of capital and access to more sophisticated technologies. The importance of the small firm in this sector can be seen in the experience of the countries of the European Economic Community, where in 1985, out of a total of more than one million construction firms, 90% had up to 10 employees. A similar situation is found in the United States, where, according to 1965 data, 90% of the 875,000 firms then existing in the country also employed fewer than 10 people (UN, 1987). In general, the importance of the topic becomes evident when one analyzes the weight of the construction industry in the national economy, which is around 5% of GDP in industrialized countries, while in newly industrialized countries this share can reach 7% (UN, op. cit.). This shows that the construction industry represents the main item in the composition of investment (gross fixed capital formation) in the national accounts of several countries, with a relative share almost always above 55% in industrialized countries and 60% in newly industrialized countries (UN, op. cit.). In Brazil this importance is confirmed by the value added of the housing sector, corresponding to 2.2% of GDP, and, in a broader view encompassing the construction industry as a whole, to about 7.3% of GDP (FIBGE, 1988). As for labor, despite the absence of data in Brazil, its broad influence on the economy is quite visible, since the sector acts as an employment multiplier, especially through the intensive use of less-skilled labor. While in the USA and the EEC 90% of firms were organizations with fewer than 10 employees, in Brazil, where official data are lacking, it can be conservatively estimated that this percentage ranges from 50% to 70% of all firms, a figure of great magnitude. It was precisely the weight of these numbers, without even considering the housing deficit of around 10 million people (one third of the population lives in inadequate conditions; Exame, 1991), that strongly influenced the choice of the topic of this thesis proposal.