992 results for "Architecture for a Free Subjectivity"


Relevance:

20.00%

Publisher:

Abstract:

The filter method is a technique for solving nonlinear programming problems. Each iteration of a filter algorithm has two phases: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is often not differentiable, or its derivatives are unknown. In such cases it becomes essential to use optimization methods that require neither the calculation of derivatives nor the verification of their existence: direct search methods and derivative-free methods are examples of such techniques. In this work we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of simplex and filter methods. The method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
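To illustrate the filter idea in isolation, the following is a minimal Python sketch of a coordinate (direct) search that accepts trial points through a filter: a candidate pair (f, h) of objective value and infeasibility is rejected only if some stored pair is at least as good in both components. This is a deliberately simplified toy, not the simplex-based method proposed in the work (no simplex machinery, no sufficient-decrease conditions).

```python
import math

def accepted(point_fh, filter_set):
    """A point (f, h) is accepted if no stored filter entry dominates it,
    i.e. no entry has both lower-or-equal f and lower-or-equal h."""
    f, h = point_fh
    return all(not (ff <= f and hh <= h) for ff, hh in filter_set)

def coordinate_filter_search(obj, cons, x0, step=0.5, iters=60):
    """Polls +/- step along each coordinate, accepts moves via the
    filter, halves the step when no move is accepted, and reports the
    best feasible point visited."""
    x = list(x0)
    h = lambda v: max(0.0, cons(v))            # infeasibility measure
    filt = [(obj(x), h(x))]
    best_x, best_f = (x[:], obj(x)) if h(x) == 0 else (None, math.inf)
    for _ in range(iters):
        moved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x[:]
                y[i] += s
                fy, hy = obj(y), h(y)
                if accepted((fy, hy), filt):
                    filt.append((fy, hy))
                    x, moved = y, True
                    if hy == 0 and fy < best_f:
                        best_x, best_f = y[:], fy
        if not moved:
            step /= 2.0
    return best_x, best_f

# Toy example: minimize (x-2)^2 + (y-1)^2 subject to x + y <= 2.
obj  = lambda v: (v[0] - 2) ** 2 + (v[1] - 1) ** 2
cons = lambda v: v[0] + v[1] - 2.0             # <= 0 when feasible
```

The filter lets the search trade temporary infeasibility for objective decrease, while the best feasible point found is what is ultimately reported.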

Relevance:

20.00%

Publisher:

Abstract:

Master's in Electrical and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

A novel high-throughput and scalable unified architecture for the computation of the transform operations in video codecs for advanced standards is presented in this paper. This structure can be used as a hardware accelerator in modern embedded systems to efficiently compute all the two-dimensional 4×4 and 2×2 transforms of the H.264/AVC standard. Moreover, its highly flexible design and hardware efficiency allow it to be easily scaled in terms of performance and hardware cost to meet the specific requirements of any given video coding application. Experimental results obtained using a Xilinx Virtex-5 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, which achieves a throughput per unit of area higher than other comparable recently published designs targeting the H.264/AVC standard. The results also showed that, when integrated in a multi-core embedded system, this architecture provides speedup factors of about 120× with respect to pure software implementations of the transform algorithms, thereby allowing the real-time computation of all the above-mentioned transforms for Ultra High Definition Video (UHDV) sequences (7,680 × 4,320 @ 30 fps).
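In software terms, the 2-D 4×4 transform this architecture accelerates reduces to two small integer matrix products. A reference sketch in Python using the standard H.264/AVC forward core matrix (post-scaling is folded into quantisation, as is usual):

```python
# H.264/AVC forward 4x4 integer core transform: Y = Cf . X . Cf^T
Cf = [[1,  1,  1,  1],
      [2,  1, -1, -2],
      [1, -1, -1,  1],
      [1, -2,  2, -1]]

def matmul(A, B):
    """Plain integer matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def forward_4x4(X):
    """Y = Cf . X . Cf^T; the scaling stage is absorbed by quantisation."""
    CfT = [list(r) for r in zip(*Cf)]
    return matmul(matmul(Cf, X), CfT)
```

For a constant 4×4 block all the energy lands in the DC coefficient, which is a quick sanity check for any hardware implementation of the same datapath.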

Relevance:

20.00%

Publisher:

Abstract:

The species abundance distribution (SAD) has been a central focus of community ecology for over fifty years, and is currently the subject of widespread renewed interest. The gambin model has recently been proposed as a model that provides a superior fit to commonly preferred SAD models. It has also been argued that the model's single parameter (α) presents a potentially informative ecological diversity metric, because it summarises the shape of the SAD in a single number. Despite this potential, few empirical tests of the model have been undertaken, perhaps because the necessary methods and software for fitting the model have not existed. Here, we derive a maximum likelihood method to fit the model, and use it to undertake a comprehensive comparative analysis of the fit of the gambin model. The functions and computational code to fit the model are incorporated in a newly developed free-to-download R package (gambin). We test the gambin model using a variety of datasets and compare the fit of the gambin model to fits obtained using the Poisson lognormal, logseries and zero-sum multinomial distributions. We found that gambin almost universally provided a better fit to the data and that the fit was consistent for a variety of sample grain sizes. We demonstrate how α can be used to differentiate intelligibly between community structures of Azorean arthropods sampled in different land use types. We conclude that gambin presents a flexible model capable of fitting a wide variety of observed SAD data, while providing a useful index of SAD form in its single fitted parameter. As such, gambin has wide potential applicability in the study of SADs, and ecology more generally.
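The maximum-likelihood fitting approach described above can be illustrated compactly. Since the gambin pmf itself is lengthy, the sketch below fits the logseries distribution (one of the comparison models named in the abstract) by maximising the log-likelihood with SciPy; it illustrates the fitting procedure, not the gambin R package's implementation.

```python
# Illustrative MLE fit of a logseries SAD, as a stand-in for the
# likelihood-based fitting procedure described above.
import numpy as np
from scipy.stats import logser
from scipy.optimize import minimize_scalar

def fit_logseries(abundances):
    """Maximum-likelihood estimate of the logseries parameter p,
    returning (p_hat, maximised log-likelihood)."""
    x = np.asarray(abundances)
    nll = lambda p: -np.sum(logser.logpmf(x, p))
    res = minimize_scalar(nll, bounds=(1e-6, 1 - 1e-6), method="bounded")
    return res.x, -res.fun
```

Model comparison then amounts to computing such maximised log-likelihoods (or AIC values) for each candidate SAD model on the same abundance vector.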

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation, Communication Sciences, 3 June 2015, Universidade dos Açores.

Relevance:

20.00%

Publisher:

Abstract:

This work arose within the scope of the Master's dissertation in Sustainable Energy at the Instituto Superior de Engenharia do Porto, supervised by advisors from the Laboratório Ecotermolab of the Instituto de Soldadura e Qualidade and from the Instituto Superior de Engenharia do Porto, so as to keep the work in line with the proposed objectives.

This thesis studied the impact of fresh (outdoor) air on the air conditioning of buildings, supported by dynamic simulation of the building under real conditions in a suitable program accredited under the ASHRAE 140-2004 standard. The work sought to show the impact of fresh air on the air conditioning of a building under the combination of several factors, such as occupancy, activities and usage patterns (schedules), lighting and equipment, and further studied the possibility of the system operating in "free-cooling" mode. The central question was to determine to what extent a space can be conditioned solely by the introduction of fresh air in free-cooling mode, through an all-air Variable Air Volume (VAV) system, without the support of any other auxiliary air-conditioning system located in the space, while respecting the minimum airflow rates imposed by the RSECE (Decreto-Lei 79/2006).

In a first phase, all the data required to determine the building's thermal loads were identified, taking into account every factor contributing to the thermal load, such as heat transmission and its components, lighting, ventilation, equipment use and occupancy levels. Subsequently, several dynamic simulations were carried out with the EnergyPlus program integrated in DesignBuilder, combining variables ranging from the envelope and the architecture itself to occupancy profiles, equipment and air-change rates in the different spaces of the building under study.

Several models were obtained in order to support an in-depth comparative study of the impact of fresh air on the building's air conditioning and of the system's capacity to operate in free-cooling mode. The analysis and comparison of the data obtained led to the following conclusions. For very high cooling demands, daytime free-cooling proved ineffective or nearly so for the type of climate found in Portugal, since the temperature differential between outdoors and indoors is not large enough to remove the loads and bring the indoor temperature down into the comfort range. Free-cooling at night, outside working hours, proved far more effective: very good performance was obtained above all during the heating season and mid-season, bearing in mind that cooling needs exist even during the heating season. Regarding night ventilation, that is, during the early-morning hours while the building is closed, it was concluded that it helps to discharge the heat accumulated during the day in the building's construction materials, heat that would otherwise be released back into the spaces later on. Between the two variables considered, an increase in the supplied fresh-air flow and the temperature differential between outdoor and indoor air, the latter was shown to contribute more to heat removal. Finally, it is clear that, in general, an air-conditioning system will always be indispensable because of high internal loads and indoor temperature and humidity requirements; nevertheless, free-cooling is recommended as a viable option to incorporate into the air-conditioning solution, promoting natural cooling, reduced energy consumption and the active introduction of fresh air.
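The conclusion that the indoor-outdoor temperature differential outweighs an increase in supply airflow follows directly from the sensible-heat relation Q = ρ · V̇ · cp · ΔT. A minimal sketch with assumed standard air properties (the numeric values are illustrative, not taken from the simulations):

```python
# Sensible cooling delivered by outdoor ("free-cooling") air.
# Assumed standard air properties: rho = 1.2 kg/m^3, cp = 1005 J/(kg*K).
def free_cooling_power(flow_m3_s, t_indoor_c, t_outdoor_c,
                       rho=1.2, cp=1005.0):
    """Q [W] = rho * V_dot * cp * (T_in - T_out); positive when the
    outdoor air can remove heat from the space."""
    return rho * flow_m3_s * cp * (t_indoor_c - t_outdoor_c)

# 1 m^3/s of outdoor air with a 5 K differential removes about 6 kW;
# doubling the differential doubles Q at no extra fan energy, which is
# why the night-time differential matters more than extra airflow.
q = free_cooling_power(1.0, 25.0, 20.0)
```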

Relevance:

20.00%

Publisher:

Abstract:

A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure, able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8k Ultra High Definition Television (UHDTV) (7680×4320 @ 30fps).
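One property that makes a unified, scalable datapath for all HEVC transform sizes practical is that each smaller core matrix is embedded in the larger ones: the even rows of the 2N-point matrix, truncated to their first N columns, reproduce the N-point matrix. A short check in Python (matrices transcribed from the HEVC specification):

```python
# HEVC 4-point and 8-point core transform matrices.
T4 = [[64,  64,  64,  64],
      [83,  36, -36, -83],
      [64, -64, -64,  64],
      [36, -83,  83, -36]]

T8 = [[64,  64,  64,  64,  64,  64,  64,  64],
      [89,  75,  50,  18, -18, -50, -75, -89],
      [83,  36, -36, -83, -83, -36,  36,  83],
      [75, -18, -89, -50,  50,  89,  18, -75],
      [64, -64, -64,  64,  64, -64, -64,  64],
      [50, -89,  18,  75, -75, -18,  89, -50],
      [36, -83,  83, -36, -36,  83, -83,  36],
      [18, -50,  75, -89,  89, -75,  50, -18]]

def embedded(larger, n):
    """The n-point matrix recovered from the even rows of the
    2n-point matrix, truncated to their first n columns."""
    return [larger[2 * i][:n] for i in range(n)]
```

Because of this embedding, one multiplier array sized for the largest transform can be reused, with parts gated off, for every smaller size; this is the kind of reuse a unified structure exploits.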

Relevance:

20.00%

Publisher:

Abstract:

Communication in international conference with peer review: First International Congress on Cardiovascular Technologies - CARDIOTECHNIX, Vilamoura, Portugal, 2013

Relevance:

20.00%

Publisher:

Abstract:

Purpose - The study evaluates the pre- and post-training lesion localisation ability of a group of novice observers. Parallels are drawn with the performance of inexperienced radiographers taking part in preliminary clinical evaluation (PCE) and ‘red-dot’ systems operating within radiography practice. Materials and methods - Thirty-four novice observers searched 92 images for simulated lesions. Pre-training and post-training evaluations were completed following the free-response receiver operating characteristic (FROC) method. Training consisted of observer performance methodology, the characteristics of the simulated lesions and information on lesion frequency. Jackknife alternative FROC (JAFROC) and highest-rating inferred ROC analyses were performed to evaluate the performance difference on lesion-based and case-based decisions. The significance level of the test was set at 0.05 to control the probability of a Type I error. Results - JAFROC analysis (F(3,33) = 26.34, p < 0.0001) and highest-rating inferred ROC analysis (F(3,33) = 10.65, p = 0.0026) revealed a statistically significant difference in lesion detection performance. The JAFROC figure of merit was 0.563 (95% CI 0.512, 0.614) pre-training and 0.677 (95% CI 0.639, 0.715) post-training. The highest-rating inferred ROC figure of merit was 0.728 (95% CI 0.701, 0.755) pre-training and 0.772 (95% CI 0.750, 0.793) post-training. Conclusions - This study has demonstrated that novice observer performance can improve significantly. The study design may be relevant to the assessment of inexperienced radiographers taking part in PCE or a commenting scheme for trauma.
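For readers unfamiliar with the JAFROC figure of merit quoted above, it can be read as the probability that a lesion's rating exceeds the highest false-positive rating on a normal case, with half credit for ties. A minimal sketch of that core statistic (the actual JAFROC software adds lesion weighting and variance estimation that this toy omits):

```python
# Toy JAFROC-style figure of merit: fraction of (lesion, normal-case)
# pairs in which the lesion's rating beats the case's highest
# false-positive rating; ties score 0.5; unmarked lesions get -inf.
def jafroc_fom(lesion_ratings, normal_fp_ratings):
    """lesion_ratings: one rating per lesion (float('-inf') if the
    lesion was never marked). normal_fp_ratings: the highest
    false-positive rating on each normal case (float('-inf') if the
    case received no marks)."""
    psi = lambda tp, fp: 1.0 if tp > fp else (0.5 if tp == fp else 0.0)
    pairs = [(tp, fp) for tp in lesion_ratings for fp in normal_fp_ratings]
    return sum(psi(tp, fp) for tp, fp in pairs) / len(pairs)
```

A figure of merit of 1.0 means every lesion outscored every normal case's worst false positive; 0.5 is chance-level separation.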

Relevance:

20.00%

Publisher:

Abstract:

Conference: IEEE 24th International Conference on Application-Specific Systems, Architectures and Processors (ASAP), June 5-7, 2013

Relevance:

20.00%

Publisher:

Abstract:

Two chromatographic methods, gas chromatography with flame ionization detection (GC–FID) and liquid chromatography with ultraviolet detection (LC–UV), were used to determine furfuryl alcohol in several kinds of foundry resins, after application of an optimised extraction procedure. The GC method developed proved feasible regardless of the resin type. Analysis by LC was suitable only for furanic resins: interference present in the phenolic resins prevented appropriate quantification by LC. Both methods gave accurate and precise results. Recoveries were >94%; relative standard deviations were ≤7% and ≤0.3% for the GC and LC methods, respectively. Agreement between the two methods was good (relative deviations ≤3%).
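Recovery and relative standard deviation, the validation metrics quoted in this and the following abstracts, are simple to compute from replicate measurements. A small sketch (the numbers below are illustrative, not the paper's data):

```python
# Standard analytical-method validation metrics.
import statistics

def recovery_pct(measured, spiked):
    """Recovery = 100 * measured concentration / spiked concentration."""
    return 100.0 * measured / spiked

def rsd_pct(values):
    """Relative standard deviation = 100 * sample std dev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```

A recovery near 100% indicates accuracy (little analyte lost in extraction), while a low RSD over replicates indicates precision.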

Relevance:

20.00%

Publisher:

Abstract:

Funding agency: Fundação para a Ciência e a Tecnologia (FCT) - PEst-OE/FIS/UI0777/2013; CERN/FP/123580/2011; PTDC/FIS-NUC/0548/2012

Relevance:

20.00%

Publisher:

Abstract:

Formaldehyde is a toxic component present in foundry resins. Its quantification is important for the characterisation of the resin (type and degradation) as well as for the evaluation of free contaminants present in wastes generated by the foundry industry. The complexity of the matrices involved calls for separative techniques. The method developed for the identification and quantification of formaldehyde in foundry resins is based on the determination of free carbonyl compounds by derivatization with 2,4-dinitrophenylhydrazine (DNPH), adapted to these matrices using liquid chromatography (LC) with UV detection. Formaldehyde determinations in several foundry resins gave precise results; mean recovery and R.S.D. were >95% and 5%, respectively. Analyses by the hydroxylamine reference method gave comparable results. However, the hydroxylamine reference method proved applicable only to one specific kind of resin, whereas the developed method performed well for all the resins studied.

Relevance:

20.00%

Publisher:

Abstract:

This article describes and discusses the factors associated with the reemergence of yellow fever and its transmission dynamics in the states of São Paulo (Southeastern Brazil) and Rio Grande do Sul (Southern Brazil) during 2008 and 2009. The following factors played a pivotal role in the reemergence of yellow fever in these areas: a large susceptible human population; high prevalence of vectors and primary hosts (non-human primates); favorable climate conditions, especially increased rainfall; the emergence of a new genetic lineage; and the circulation of people and/or monkeys infected by the virus. An effective surveillance program is needed to prevent the reemergence of yellow fever in other Brazilian states.

Relevance:

20.00%

Publisher:

Abstract:

Phenol is a toxic compound present in a wide variety of foundry resins. Its quantification is important for the characterization of the resins as well as for the evaluation of free contaminants present in foundry wastes. Two chromatographic methods, liquid chromatography with ultraviolet detection (LC-UV) and gas chromatography with flame ionization detection (GC-FID), were developed for the analysis of free phenol in several foundry resins after a simple extraction procedure (30 min). Both chromatographic methods were suitable for the determination of phenol in the studied furanic and phenolic resins, showing good selectivity, accuracy (recovery 99–100%; relative deviations <5%) and precision (coefficients of variation <6%). The ASTM reference method was found to be useful only for the analysis of phenolic resins, while the LC and GC methods were applicable to all the studied resins. The developed methods reduce the analysis time from 3.5 hours to about 30 min and can readily be used in routine quality control laboratories.