994 results for problem complexity
Abstract:
Recent observations of type Ia supernovae and of cosmic microwave background (CMB) anisotropies have revealed that most of the matter of the Universe interacts in a repulsive manner, composing the so-called dark energy constituent of the Universe. Determining the properties of dark energy is one of the most important tasks of modern cosmology, and this is the main motivation for this work. The analysis of cosmic gravitational waves (GW) represents, besides the CMB temperature and polarization anisotropies, an additional approach to determining parameters that may constrain dark energy models and their consistency. In recent work, a generalized Chaplygin gas model was considered in a flat universe and the corresponding spectrum of gravitational waves was obtained. In the present work we have added a massless gas component to that model and compared the new spectrum to the previous one. The Chaplygin gas is also used to simulate a ΛCDM model by means of a particular combination of parameters, so that the Chaplygin gas and ΛCDM models can be easily distinguished in the theoretical scenarios established here. We find that the models are strongly degenerate in the range of frequencies studied. This degeneracy is in part expected, since the models must converge to each other when particular combinations of parameters are considered.
Abstract:
This thesis set out to investigate the inferential logic of actions and their significations in situations that mobilize the notions of probabilistic composition and chance, as well as the role of signification models in the cognitive functioning of adults. Twelve young adult students from the working class participated, volunteers of both sexes, drawn from a technical course integrated with the secondary level of Youth and Adult Education. Three individual sessions were held, with audio and spreadsheet recording, using two games, Likid Gaz and Lucky Cassino, from the software Missão Cognição (Haddad-Zubel, Pinkas & Pécaut, 2006), and the game Soma dos Dados (Silva, Rossetti & Cristo, 2012). The task procedures were adapted from Silva and Frezza (2011): 1) presentation of the game; 2) playing the game; 3) semi-structured interview; 4) application of three problem situations with intervention according to the Clinical Method; 5) a new round of the game; and 6) two further problem situations without Clinical Method intervention. Levels of heuristic analysis, game understanding and signification models were devised from the identification of particular features of procedures and significations in the games. The first study examined the implications of signification models and prior representations in adult thought, considering that the subject organizes his or her prior representations or schemes concerning an object in the form of signification models, according to the degree of complexity and novelty of the task and of its logical-mathematical structure; these models evolve through the process of equilibration, which requires the demand to signify that aspect of reality. The second study investigated the notion of deducible combination evidenced in the game Likid Gaz, identifying the role of signification models in the choice of procedures, which entailed the rejection of systematization or enumeration behaviours. The initial levels of heuristic analysis of the game predominated.
The third study examined the notion of probability observed in the game Lucky Cassino, in which most participants reached an intermediate level of game understanding, with a greater diversity of signification models than in the other games, although the most elementary ones predominated. The synthesis of the notions of combination, probability and chance was explored in the fourth study through the game Soma dos Dados (Silva, Rossetti & Cristo, 2012), identifying that one limitation to an adequate understanding of the links intertwined in these notions is the signifying implication "if random A, then indeterminate D" (notation A → D), with the construction of pseudo-necessities and pseudo-obligations, or even local necessities inappropriately generalized. The resistance or obstacles of the object should provoke perturbations, but the cognitive structure, the social environment and cultural models, and affectivity may interfere in this process.
Abstract:
The central goal of this paper is to reflect on Brazilian military power and its connection to the country's international ambitions in the 21st century. After a comparative analysis with the other BRICs and a historical review of Brazil's strategic irrelevance, we aim to establish the minimum military capacity Brazil would need in order to pursue the country's latest international interests. We also discuss whether the National Strategy of Defense, approved in 2008, and the recent strategic agreements signed with France represent a further step toward this minimum military capacity.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability) is one of the most important attributes. In order to fulfill the increased requirements within the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined throughout the entire part for reasons of complexity. On the other hand, for precise fracture analysis within FEM, correct thickness consideration is one key enabler. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
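To make the nearest-neighbour idea concrete, here is a minimal sketch (not the authors' algorithm: the brute-force search, the surface sampling and the flat-plate test data are all illustrative assumptions). The local thickness at a midplane point is estimated as the sum of its distances to the nearest sampled point on each outer surface of the part.

```python
import math

def nearest_distance(p, points):
    """Brute-force nearest-neighbour distance from p to a point cloud."""
    return min(math.dist(p, q) for q in points)

def estimate_thickness(midplane_point, top_surface, bottom_surface):
    """Per-element thickness estimate: distance from the midplane point
    to the nearest sampled point on each outer surface."""
    return (nearest_distance(midplane_point, top_surface) +
            nearest_distance(midplane_point, bottom_surface))

# Flat plate of thickness 2.0: midplane at z = 0, outer surfaces at z = ±1
top = [(x, y, 1.0) for x in range(3) for y in range(3)]
bottom = [(x, y, -1.0) for x in range(3) for y in range(3)]
print(estimate_thickness((1.0, 1.0, 0.0), top, bottom))  # → 2.0
```

A production implementation would use a spatial index (e.g. a k-d tree) instead of the linear scan, which is exactly where the "3D range searches" of the paper come in.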
Abstract:
This paper presents a new generalized solution for DC bus capacitors voltage balancing in back-to-back m level diode-clamped multilevel converters connecting AC networks. The solution is based on the DC bus average power flow and exploits the switching configuration redundancies. The proposed balancing solution is particularized for the back-to-back multilevel structure with m=5 levels. This back-to-back converter is studied working with bidirectional power flow, connecting an induction machine to the power grid.
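The core of a redundancy-based balancing scheme can be sketched as follows (an illustration only: the state table, the simple current model and all names are invented for the example, not taken from the paper). Each redundant switching state produces the same output voltage but routes the load current through different capacitors; the controller picks the state whose predicted capacitor voltages deviate least from their average.

```python
def pick_redundant_state(cap_voltages, redundant_states, load_current, dt, capacitance):
    """Choose the redundant switching state whose predicted capacitor
    voltages deviate least from their common average.

    redundant_states maps a state label to per-capacitor current signs
    (+1 charges the capacitor, -1 discharges it) -- a toy model."""
    def imbalance(voltages):
        avg = sum(voltages) / len(voltages)
        return sum((v - avg) ** 2 for v in voltages)

    best_state, best_cost = None, float("inf")
    for state, cap_currents in redundant_states.items():
        # Predict each capacitor voltage one switching period ahead
        predicted = [v + sign * load_current * dt / capacitance
                     for v, sign in zip(cap_voltages, cap_currents)]
        cost = imbalance(predicted)
        if cost < best_cost:
            best_state, best_cost = state, cost
    return best_state

# Two capacitors, the first overcharged; state "B" discharges it
states = {"A": (1, -1), "B": (-1, 1)}
print(pick_redundant_state([105.0, 95.0], states, 10.0, 1e-4, 1e-3))  # → B
```

In the actual converter the prediction would come from the DC bus average power flow rather than this toy per-capacitor sign model.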
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensated errors in some frame regions, while errors are rather small in other regions, depending on the video content. In this paper, a low complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements up to 1.2 dB relative to a solution using the WZ coding mode only.
Abstract:
Formaldehyde was the first air pollutant to emerge, already in the 1970s, as a specifically non-industrial indoor air quality problem. It has remained an indoor air quality issue, and the formaldehyde level in residential indoor air is among the highest of any indoor air contaminant. Formaldehyde concentrations in 4 different indoor settings (schools, office buildings, new dwellings and occupied dwellings) in Portugal were measured using Photo Ionization Detection (PID) equipment (11.7 eV lamps). All the settings presented results higher than the reference value proposed by Portuguese legislation. Furthermore, occupied dwellings included 3 units with results above the reference. We conclude that formaldehyde is present in the monitored indoor settings. Concentration levels are higher than the Portuguese reference value for indoor settings, and this may indicate health problems for occupants.
Abstract:
Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder has a critical role in determining the WZ video coding rate-distortion (RD) performance, notably to raise it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some reference, decoded frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated in a transform-domain turbo coding based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it allows outperforming H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
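The frame interpolation underlying such side information creation rests on block matching between decoded reference frames. A minimal full-search sketch (a textbook SAD search on toy grayscale frames, not the regularized estimator proposed in the paper; frame sizes and names are illustrative):

```python
def sad(a, b, ax, ay, bx, by, bsize):
    """Sum of absolute differences between two equal-size blocks."""
    return sum(abs(a[ay + j][ax + i] - b[by + j][bx + i])
               for j in range(bsize) for i in range(bsize))

def best_match(prev, nxt, x, y, bsize, search):
    """Full-search block matching: the motion vector (dx, dy) moving the
    block at (x, y) in `prev` to its most similar block in `nxt`."""
    h, w = len(nxt), len(nxt[0])
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx <= w - bsize and 0 <= ny <= h - bsize:
                cost = sad(prev, nxt, x, y, nx, ny, bsize)
                if cost < best_cost:
                    best, best_cost = (dx, dy), cost
    return best

# A 2x2 bright patch moves 2 pixels to the right between the frames
prev = [[0] * 8 for _ in range(8)]
nxt = [[0] * 8 for _ in range(8)]
for j in (2, 3):
    for i in (2, 3):
        prev[j][i] = 255
        nxt[j][i + 2] = 255
print(best_match(prev, nxt, 2, 2, 2, 3))  # → (2, 0)
```

A decoder-side interpolator would run such a search between both reference frames and place the motion-compensated result at the temporal midpoint; the paper's contribution lies precisely in regularizing this motion field to avoid the erratic vectors a raw SAD search produces.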
Abstract:
Financial literature and the financial industry often use zero coupon yield curves as input for testing hypotheses, pricing assets or managing risk, and they assume the provided data are accurate. We analyse the implications of the methodology and of the sample selection criteria used to estimate the zero coupon bond yield term structure on the resulting volatility of spot rates with different maturities. We obtain the volatility term structure using historical volatilities and EGARCH volatilities. As input for these volatilities we consider our own spot rate estimation from GovPX bond data and three popular interest rate data sets: from the Federal Reserve Board, from the US Department of the Treasury (H15), and from Bloomberg. We find strong evidence that the resulting zero coupon bond yield volatility estimates, as well as the correlation coefficients among spot and forward rates, depend significantly on the data set. We observe relevant differences in economic terms when volatilities are used to price derivatives.
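The historical-volatility leg of such a volatility term structure can be sketched in a few lines (an illustration only: the annualisation convention and the toy rate series are assumptions, and the EGARCH leg is deliberately omitted as it requires maximum-likelihood estimation):

```python
import math
import statistics

def historical_volatility(rates, trading_days=252):
    """Historical volatility of a spot-rate series: the sample standard
    deviation of daily rate changes, annualised by sqrt(trading_days)."""
    changes = [b - a for a, b in zip(rates, rates[1:])]
    return statistics.stdev(changes) * math.sqrt(trading_days)

# Toy series of daily 10-year spot rates (in percent)
rates = [4.00, 4.05, 3.98, 4.02, 4.10, 4.07]
print(round(historical_volatility(rates), 4))
```

Running this on the same maturities across the four data sets (GovPX-based estimates, Federal Reserve Board, H15, Bloomberg) is exactly the kind of comparison in which the paper finds significant differences.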
Abstract:
Exposure to certain fungi can cause human illness. Fungi cause adverse human health effects through three specific mechanisms: generation of a harmful immune response (e.g., allergy or hypersensitivity pneumonitis); direct infection by the fungal organism; and toxic-irritant effects from mold byproducts, such as mycotoxins. In Portugal there is a growing industry of large facilities that produce whole chickens for domestic consumption, yet only a few investigations have reported on fungal contamination of the poultry litter. The material used for poultry litter varies but normally consists of pine shavings, eucalyptus sawdust, other types of wood, peanut, coffee, sugar cane, straw, hay, grass or processed paper. Litter is one of the factors that contributes most to fungal contamination in poultry houses. Spreading litter is one of the tasks that normally involve higher exposure of poultry workers to dust, fungi and their metabolites, such as VOCs and mycotoxins. After being used and removed from poultry houses, litter is ploughed into agricultural soils, a practice potentially dangerous for the soil environment as well as for humans and animals. The goal of this study was to characterize the litter's fungal contamination and also to report the incidence of keratinophilic and toxigenic fungi.
Abstract:
Project presented to the Instituto Superior de Contabilidade e Administração do Porto to obtain the Master's degree in Administrative Assistance (Assessoria de Administração)
Abstract:
We discuss existence and multiplicity of positive solutions of the Dirichlet problem for the quasilinear ordinary differential equation -(u′/√(1 − u′²))′ = f(t, u). Depending on the behaviour of f = f(t, s) near s = 0, we prove the existence of either one, or two, or three, or infinitely many positive solutions. In general, the positivity of f is not required. All results are obtained by reduction to an equivalent non-singular problem to which variational or topological methods apply in a classical fashion.
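For readability, the Dirichlet problem above can be typeset as follows (the interval (0, T) and the homogeneous boundary conditions are the standard setting assumed here; the abstract itself does not spell them out):

```latex
\begin{cases}
-\left( \dfrac{u'}{\sqrt{1 - u'^{2}}} \right)' = f(t, u), & t \in (0, T), \\[2ex]
u(0) = u(T) = 0,
\end{cases}
```

where the operator on the left-hand side is often called the one-dimensional relativistic or Minkowski-curvature operator; its singularity at |u′| = 1 is what the reduction to an equivalent non-singular problem removes.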
Abstract:
Although numerous studies have been conducted on microbial contaminants associated with various stages of poultry and meat product processing, only a few have reported on fungal contamination of poultry litter. The goals of this study were to (1) characterize litter fungal contamination and (2) report the incidence of keratinophilic and toxigenic fungi. Seven fresh and 14 aged litter samples were collected from 7 poultry farms. In addition, 27 air samples of 25 litres were also collected through the impaction method, and after laboratory processing and incubation of the collected samples, quantitative colony-forming unit (CFU/m3) and qualitative results were obtained. Twelve different fungal species were detected in fresh litter, and Penicillium was the most frequent genus found (59.9%), followed by Alternaria (17.8%), Cladosporium (7.1%), and Aspergillus (5.7%). With respect to aged litter, 19 different fungal species were detected, with Penicillium sp. the most frequently isolated (42.3%), followed by Scopulariopsis sp. (38.3%), Trichosporon sp. (8.8%), and Aspergillus sp. (5.5%). A significant positive correlation was found between litter fungal contamination (CFU/g) and air fungal contamination (CFU/m3). Litter fungal quantification and species identification have important implications in the evaluation of potential adverse health risks to exposed workers and animals. Spreading of poultry litter on agricultural fields is a potential public health concern, since keratinophilic (Scopulariopsis and Fusarium genera) as well as toxigenic fungi (Aspergillus, Fusarium, and Penicillium genera) were isolated.
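The reported correlation between litter counts (CFU/g) and air counts (CFU/m3) is a standard paired-sample computation; a sketch with invented data (the study's actual values and its choice of correlation statistic are not given in the abstract):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented paired counts per sampling site: litter (CFU/g) vs. air (CFU/m3)
litter_cfu = [1200, 3500, 800, 5000, 2100]
air_cfu = [150, 420, 90, 610, 260]
print(round(pearson(litter_cfu, air_cfu), 3))
```

For small samples like these, a rank-based statistic (Spearman) would be the more robust choice; the abstract does not say which was used.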
Abstract:
In this paper we present a user-centered interface for a scheduling system. The purpose of this interface is to provide graphical and interactive ways of defining a scheduling problem. To create this user interface, an evaluation-centered user interaction development method was adopted: the star life cycle. The created prototype comprises the Task Module and the Scheduling Problem Module. The first allows users to define a sequence of operations, i.e., a task. The second enables the definition of a scheduling problem, which consists of a set of tasks. Both modules are equipped with a set of real-time validations to assure the correct definition of the data input required by the scheduling module of the system. The usability evaluation allowed us to measure the ease of interaction and to observe the different forms of interaction exhibited by each participant, namely the reactions to the real-time validation mechanism.
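The two modules map naturally onto a task/operation hierarchy with validation on entry. A hedged sketch of such a data model (the class and field names are invented for illustration; the paper describes a graphical interface, not this API):

```python
from dataclasses import dataclass, field

@dataclass
class Operation:
    machine: str
    duration: int

    def __post_init__(self):
        # Real-time-style validation: reject bad input as soon as it is entered.
        if self.duration <= 0:
            raise ValueError("operation duration must be positive")

@dataclass
class Task:
    """Task Module: a task is a sequence of operations."""
    name: str
    operations: list = field(default_factory=list)

@dataclass
class SchedulingProblem:
    """Scheduling Problem Module: a problem is a set of tasks."""
    tasks: list = field(default_factory=list)

    def add_task(self, task):
        # A task is only accepted if it defines at least one operation.
        if not task.operations:
            raise ValueError("a task must contain at least one operation")
        self.tasks.append(task)

problem = SchedulingProblem()
problem.add_task(Task("T1", [Operation("M1", 5), Operation("M2", 3)]))
print(len(problem.tasks))  # → 1
```

In the prototype these checks run interactively as the user draws the problem, which is what the usability evaluation probed with its real-time validation mechanism.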
Abstract:
Many of the most common human reasoning functions, such as temporal and non-monotonic reasoning, have not yet been fully mapped into deployed systems, even though some theoretical breakthroughs have already been accomplished. This is mainly due to the inherent computational complexity of the theoretical approaches. In the particular area of fault diagnosis in power systems, however, some systems that attempt to solve the problem have been deployed, using methodologies such as production-rule-based expert systems, neural networks, recognition of chronicles, fuzzy expert systems, etc. SPARSE (from the Portuguese acronym, meaning expert system for incident analysis and restoration support) was one of these systems and, in the course of its development, came the need to cope with incomplete and/or incorrect information, as well as with the traditional problems of power system fault diagnosis based on SCADA (supervisory control and data acquisition) information retrieval, namely real-time operation, huge amounts of information, etc. This paper presents an architecture for a decision support system that solves these problems, using a symbiosis of the event calculus and default-reasoning rule-based system paradigms, ensuring soft real-time operation with the ability to handle incomplete, incorrect or domain-incoherent information. A prototype implementation of this system is already at work in the control centre of the Portuguese Transmission Network.