970 results for PBL tutorial search term
Abstract:
This study investigates the one-month-ahead, out-of-sample predictive power of a Taylor rule-based model for exchange rate forecasting. We review relevant works concluding that macroeconomic models can explain the short-run exchange rate, and we also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate changes. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To do so, we use a sample of 14 currencies against the US dollar, which allowed monthly out-of-sample forecasts to be generated from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we select currencies from both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008), in finding that the conclusion about exchange rate predictability depends on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that it is not possible to claim that the implemented model provides more accurate forecasts than a random walk, we assess whether, at least, the model is able to generate "rational", or "consistent", forecasts. To this end, we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are "inconsistent". Finally, we perform Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
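As a rough, hedged illustration of the forecasting exercise described in this abstract (not the authors' code), the sketch below fits a rolling regression of monthly exchange-rate returns on Taylor-rule fundamentals, produces one-month-ahead out-of-sample forecasts and compares their RMSE with a driftless random walk; the data, window length and variable names are all synthetic and assumed.

```python
# Illustrative sketch only: synthetic data, assumed 48-month rolling window.
import numpy as np

rng = np.random.default_rng(0)
T = 170                                   # monthly observations
infl_diff = rng.normal(size=T)            # inflation differential (home minus US)
gap_diff = rng.normal(size=T)             # output-gap differential
rate_diff = rng.normal(size=T)            # lagged interest-rate differential (smoothing term)
ds = 0.1 * infl_diff + 0.05 * gap_diff + rng.normal(size=T)   # log exchange-rate change

window = 48
err_model, err_rw = [], []
for t in range(window, T - 1):
    # regress next month's return on current fundamentals over the rolling window
    X = np.column_stack([np.ones(window), infl_diff[t - window:t],
                         gap_diff[t - window:t], rate_diff[t - window:t]])
    y = ds[t - window + 1:t + 1]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    x_now = np.array([1.0, infl_diff[t], gap_diff[t], rate_diff[t]])
    err_model.append(ds[t + 1] - x_now @ beta)   # fundamentals-based forecast error
    err_rw.append(ds[t + 1])                     # random walk forecasts no change

print("RMSE fundamentals model:", np.sqrt(np.mean(np.square(err_model))))
print("RMSE random walk      :", np.sqrt(np.mean(np.square(err_rw))))
```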
Abstract:
This paper presents an interior point method for the long-term generation scheduling of large-scale hydrothermal systems. The problem is formulated as a nonlinear programming problem due to the nonlinear representation of hydropower production and thermal fuel cost functions. Sparsity exploitation techniques and a heuristic procedure for computing the interior point method search directions have been developed. Numerical tests in case studies with systems of different dimensions and inflow scenarios have been carried out in order to evaluate the proposed method. Three systems were tested, the largest being the Brazilian hydropower system with 74 hydro plants distributed in several cascades. Results show that the proposed method is an efficient and robust tool for solving the long-term generation scheduling problem.
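To make the type of problem concrete, here is a deliberately tiny sketch: one hydro reservoir with constant productivity plus one thermal plant over six monthly stages, with a quadratic fuel cost minimized subject to reservoir storage limits. SciPy's general-purpose trust-constr solver (which uses an interior-point strategy for inequality-constrained problems) stands in for the paper's specialized sparse interior point method, and every number below is illustrative.

```python
# Toy long-term scheduling sketch; all plant data are made up, and a constant
# hydro productivity replaces the paper's nonlinear, head-dependent production.
import numpy as np
from scipy.optimize import minimize, Bounds, LinearConstraint

stages, demand, inflow = 6, 100.0, 40.0      # stages in months; demand/inflow in average MW
rho = 0.9                                    # simplified constant hydro productivity

def thermal_cost(u):
    """u = turbined outflow per stage; the thermal plant supplies the residual demand."""
    thermal = demand - rho * u               # nonnegative given the bounds below
    return np.sum(0.02 * thermal**2 + 5.0 * thermal)

# storage_t = storage_0 + t*inflow - cumulative turbined outflow, kept within limits
A = np.tril(np.ones((stages, stages)))       # cumulative-sum operator
storage0, smin, smax = 150.0, 50.0, 300.0
t_idx = np.arange(1, stages + 1)
storage_limits = LinearConstraint(-A, smin - storage0 - inflow * t_idx,
                                       smax - storage0 - inflow * t_idx)

res = minimize(thermal_cost, x0=np.full(stages, inflow), method="trust-constr",
               bounds=Bounds(0.0, 80.0), constraints=[storage_limits])
print("turbined outflow per stage:", np.round(res.x, 1))
print("total thermal cost:", round(res.fun, 1))
```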
Abstract:
In this paper, a method for solving the Short Term Transmission Network Expansion Planning (STTNEP) problem is presented. The STTNEP is a very complex mixed integer nonlinear programming problem that presents a combinatorial explosion in the search space. In this work we present a constructive heuristic algorithm to find a solution of excellent quality for the STTNEP. In each step of the algorithm, a sensitivity index is used to add a circuit (transmission line or transformer) to the system. This sensitivity index is obtained by solving the STTNEP problem with the number of circuits to be added treated as a continuous variable (relaxed problem). The relaxed problem is a large and complex nonlinear programming problem and was solved through an interior point method that combines the multiple predictor-corrector and multiple centrality corrections methods, both belonging to the family of higher order interior point methods (HOIPM). Tests were carried out using a modified Garver system, and the results show the good performance of both the constructive heuristic algorithm for the STTNEP problem and the HOIPM used in each step.
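A runnable skeleton of the constructive heuristic described above is sketched below; it is not the authors' implementation. The relaxed-problem solve and the operation check are replaced by toy stand-in functions (in the paper the relaxed problem is a large nonlinear program solved with the HOIPM), and the candidate branches and costs are invented.

```python
# Constructive-heuristic skeleton with mocked subproblems; illustrative only.
candidates = {("bus1", "bus2"): 10.0, ("bus2", "bus3"): 7.0, ("bus1", "bus3"): 12.0}  # branch: cost

def solve_relaxed(added):
    """Stand-in for the relaxed STTNEP: would return the continuous number of
    circuits n_ij per candidate branch from the interior point solve; here it
    simply decays as real circuits are added and is normalized by branch cost."""
    return {b: max(0.0, 1.5 - added.get(b, 0)) / cost for b, cost in candidates.items()}

def load_shedding(added):
    """Stand-in for the operation subproblem: pretend three added circuits
    remove all overloads/load shedding."""
    return max(0, 3 - sum(added.values()))

added = {}
while load_shedding(added) > 0:
    sensitivity = solve_relaxed(added)                 # sensitivity index per branch
    best = max(sensitivity, key=sensitivity.get)       # branch with the largest index
    added[best] = added.get(best, 0) + 1               # add one integer circuit there
    print("added circuit on", best, "-> plan:", added)

total = sum(candidates[b] * n for b, n in added.items())
print("final plan:", added, "| investment cost:", total)
```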
Abstract:
In this paper, a method for solving the short term transmission network expansion planning problem is presented. This is a very complex mixed integer nonlinear programming problem that presents a combinatorial explosion in the search space. In order to find a solution of excellent quality for this problem, a constructive heuristic algorithm is presented. In each step of the algorithm, a sensitivity index is used to add a circuit (transmission line or transformer) or a capacitor bank (fixed or variable) to the system. This sensitivity index is obtained by solving the problem with the numbers of circuits and capacitor banks to be added treated as continuous variables (relaxed problem). The relaxed problem is a large and complex nonlinear programming problem and was solved through a higher order interior point method. The paper shows results of several tests performed using three well-known electric energy systems in order to show the feasibility and the advantages of using the AC model.
Abstract:
The high active and reactive power levels demanded from distribution systems, the growth of consuming centers, and the long lines of distribution systems result in voltage variations at the buses, compromising the quality of the energy supplied. To ensure the quality of the energy supplied, short-term distribution system planning employs several devices and actions to implement effective control of the voltage, reactive power, and power factor of the network. Among these devices and actions are voltage regulators (VRs) and capacitor banks (CBs), as well as the replacement of distribution line conductors. This paper presents a methodology based on the Non-Dominated Sorting Genetic Algorithm (NSGA-II) for the optimized allocation of VRs and CBs and the replacement of conductors in radial distribution systems. The Multiobjective Genetic Algorithm (MGA) is aided by an inference process developed using fuzzy logic, which applies expert knowledge to reduce the search space for the allocation of CBs and VRs.
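As a small, hedged illustration of the NSGA-II machinery this abstract relies on, the sketch below performs only the non-dominated sorting step on made-up candidate plans scored against two assumed objectives (equipment cost and aggregate voltage deviation); crossover, mutation, crowding distance and the fuzzy search-space reduction are deliberately omitted.

```python
# Non-dominated sorting on invented two-objective scores; illustrative only.
import numpy as np

rng = np.random.default_rng(1)
# each row: [cost of installed CBs/VRs, aggregate voltage deviation] for one plan
objectives = rng.random((12, 2))

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in at least one."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated_fronts(F):
    remaining = list(range(len(F)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

for rank, front in enumerate(non_dominated_fronts(objectives)):
    print(f"front {rank}: plans {front}")
```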
Abstract:
One way to organize knowledge and make its search and retrieval easier is to create a structural representation divided into hierarchically related topics. Once this structure is built, it is necessary to find labels for each of the obtained clusters. In many cases the labels must be built using all the terms in the documents of the collection. This paper presents the SeCLAR method, which explores the use of association rules in the selection of good candidates for labels of hierarchical document clusters. The purpose of this method is to select a subset of terms by exploring the relationships among the terms of each document. These candidates can then be processed by a classical method to generate the labels. An experimental study demonstrates the potential of the proposed approach to improve the precision and recall of labels obtained by classical methods, by considering only the terms that are potentially more discriminative.
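A toy, hedged illustration of the general idea (not the SeCLAR implementation): mine simple term-to-term association rules inside one cluster and keep only the terms that take part in rules with sufficient support and confidence, so that a classical labeling method then ranks a smaller candidate set. The documents, terms and thresholds below are invented.

```python
# Association-rule-based pre-selection of label candidates; illustrative only.
from itertools import permutations

cluster_docs = [                     # one document cluster, as sets of terms
    {"neural", "network", "training"},
    {"neural", "network", "layers"},
    {"training", "gradient", "network"},
    {"gradient", "descent", "training"},
]
min_support, min_confidence = 0.5, 0.7
n = len(cluster_docs)

def support(itemset):
    return sum(itemset <= doc for doc in cluster_docs) / n

candidates = set()
terms = set().union(*cluster_docs)
for a, b in permutations(terms, 2):                      # rule a -> b
    sup_ab = support({a, b})
    if sup_ab >= min_support and sup_ab / support({a}) >= min_confidence:
        candidates.update([a, b])

print("label candidates for the cluster:", sorted(candidates))
```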
Abstract:
In this tutorial we present a review of Euler deconvolution consisting of three parts. In the first part, we recall the role of the classical 2D and 3D Euler deconvolution formulation as a method to automatically locate sources of anomalous potential fields, and we point out the difficulties of this formulation: the presence of an undesirable cloud of solutions, the empirical criterion used to determine the structural index (a parameter related to the nature of the anomalous source), the feasibility of applying Euler deconvolution to ground magnetic surveys, and the determination of the dip and the magnetic susceptibility contrast of geological contacts (or the product of the susceptibility contrast and the thickness when applied to a thin dike). In the second part, we present recent improvements aimed at minimizing some of the difficulties presented in the first part of this tutorial. These improvements include: i) the selection of the solutions essentially associated with observations presenting a high signal-to-noise ratio; ii) the use of the correlation between the estimate of the anomaly base level and the observed anomaly itself, or the combination of Euler deconvolution with the analytic signal, to determine the structural index; iii) the combination of the results of (i) and (ii), allowing the structural index to be estimated independently of the number of solutions, so that a smaller number of observations (such as in ground surveys) can be used; iv) the introduction of additional equations, independent of Euler's equation, that allow estimating the dip and the susceptibility contrast of 2D magnetic sources. In the third part we present a prognosis of future short- and medium-term developments involving Euler deconvolution. The main perspectives are: i) new attacks on the problems singled out in the second part of this tutorial; ii) the development of methods that account for interference from sources located beside or above the main source; and iii) the use of the anomalous source location estimates produced by Euler deconvolution as constraints in inversion methods to obtain the delineation of the sources in a user-friendly computational environment.
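For reference, the classical 3D formulation reviewed in the first part rests on Euler's homogeneity equation, solved by least squares within moving windows of observations; the notation below follows the convention commonly used in the Euler deconvolution literature rather than this tutorial's own symbols.

```latex
% Euler's homogeneity equation for a source at (x_0, y_0, z_0), observed
% anomalous field T, base level B and structural index \eta:
\[
  (x - x_0)\,\frac{\partial T}{\partial x}
  + (y - y_0)\,\frac{\partial T}{\partial y}
  + (z - z_0)\,\frac{\partial T}{\partial z}
  = \eta\,\bigl(B - T\bigr)
\]
% Within each data window, the unknowns (x_0, y_0, z_0, B) are estimated by
% least squares for an assumed structural index \eta; the empirical choice of
% \eta is one of the difficulties discussed in the first part of the tutorial.
```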
Abstract:
Purpose: The purpose of this paper is to propose a definition of the term "green/environmental innovation", based on a systematic literature review. Design/methodology/approach: The literature review conducted in this research was based on papers published in the ISI Web of Science and Scopus databases. Findings: Environmental innovations are organizational implementations and changes focusing on the environment, with implications for companies' products, manufacturing processes and marketing, and with different degrees of novelty. They can be merely incremental improvements that enhance the performance of something that already exists, or radical innovations that introduce something completely unprecedented, with the main objective of reducing the company's environmental impact. In addition, environmental innovation has a bilateral relationship with the level of proactive environmental management adopted by companies. Increasing environmental innovation tends to come up against many barriers.
Abstract:
Background: The search for enriched (also known as over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for a system-level analysis. This procedure tries to summarize the information focussing on classification designs such as Gene Ontology, KEGG pathways, and so on, instead of focussing on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the latter approach has been used to deal with the ontology term enrichment problem. Results: BayGO implements a Bayesian approach to search for enriched terms from microarray data. The R source code is freely available at http://blasto.iq.usp.br/~tkoide/BayGO in three versions: Linux, which can be easily incorporated into pre-existing pipelines; Windows, to be controlled interactively; and as a web tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion: The Bayesian model accounts for the fact that, eventually, not all the genes from a given category are observable in microarray data due to low intensity signal, quality filters, genes that were not spotted, and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the common significance analysis.
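For context, here is a minimal sketch of the conventional significance-based enrichment test that BayGO contrasts with: a hypergeometric (Fisher-type) p-value for a single ontology term. All counts are invented, and BayGO itself replaces this kind of test with a Bayesian estimate of the association between term membership and differential expression.

```python
# Conventional hypergeometric enrichment test for one term; invented counts.
from scipy.stats import hypergeom

M = 4000   # genes on the array (after quality filters)
K = 120    # genes annotated with the ontology term of interest
n = 250    # differentially expressed genes
k = 18     # differentially expressed genes carrying the term

# P(X >= k): probability of seeing at least k annotated genes by chance
p_value = hypergeom.sf(k - 1, M, K, n)
print(f"enrichment p-value for the term: {p_value:.3g}")
```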
Abstract:
The existence of Multiple Myeloma Stem Cells (MMSCs) is thought to be one of the major causes of MM drug resistance. However, very little is known about the molecular characteristics of MMSCs, even though some studies have suggested that these cells resemble memory B cells. In order to molecularly characterize MMSCs, we isolated the 138+ and 138- populations. For each cell fraction we performed a VDJ rearrangement analysis. The complete set of aberrations was assessed by SNP Array 6.0 and HG-U133 Plus 2.0 microarray analyses (Affymetrix). The VDJ rearrangement analyses confirmed the clonal relationship between the 138+ clone and the immature clone. Both BM and PBL 138+ clones showed exactly the same genomic macro-alterations. In the BM and PBL 138-19+27+ cell fractions, several micro-alterations (range: 1-350 Kb) unique to the memory B cell clone were highlighted. All the micro-alterations detected were located outside known genomic variant regions and are presumably associated with MM pathogenesis, as confirmed by the presence of the KRAS, WWOX and XIAP genes among the amplified regions. To gain insight into the biology of the clonotypic B cell population, we compared the gene expression profiles of 8 MM B cell samples vs 5 donor B cell samples, which showed differential expression of 11,480 probes (p-value < 0.05). Among the self-renewal mechanisms, we observed the down-regulation of the Hedgehog pathway and the hyperactivation of Notch and Wnt signaling. Moreover, these immature cells showed a particular phenotype correlated with resistance to proteasome inhibitors (IRE1α-XBP1: -18.0; -19.96; p < 0.05). The data suggest that the MM 138+ clone might recapitulate the end of the complex process of myelomagenesis, whereas the memory B cells have some intriguing micro-alterations and a specific transcriptional program, supporting the idea that these post-germinal-center cells might be involved in the transforming event that originates and sustains the neoplastic clone.
Abstract:
Although the term 'reflex sympathetic dystrophy' has been replaced by 'complex regional pain syndrome' (CRPS) type I, there remains a widespread presumption that the sympathetic nervous system is actively involved in mediating chronic neuropathic pain ["sympathetically maintained pain" (SMP)], even in the absence of detectable neuropathophysiology.
Abstract:
Purpose of review: We aimed to review literature on the efficacy and tolerability of psychosocial and psychopharmacological interventions in youth with early-onset schizophrenia spectrum disorders (EOS). A rationale for pragmatic psychopharmacology in EOS, including dosing, switching and adverse effect monitoring and management, is provided. Recent findings: Three randomized controlled trials (RCTs) over the last 8 years demonstrated benefits of psychosocial interventions (i.e. psychoeducation, cognitive remediation, cognitive behavioural therapy) for EOS without clear advantages of one psychosocial treatment over another. Six large, placebo-controlled, short-term RCTs over the last 4 years demonstrated that aripiprazole, olanzapine, paliperidone, quetiapine and risperidone, but not ziprasidone, were superior to placebo. Except for clozapine's superiority in treatment-refractory EOS, efficacy appeared similar across studied first-generation and second-generation antipsychotics, but tolerability varied greatly across individual agents. Summary: Antipsychotics are efficacious in the treatment of EOS. Given the lack of efficacy differences between antipsychotics (except for clozapine for treatment-refractory EOS), we propose that tolerability considerations need to guide choice of antipsychotics. Further and longer-term efficacy and effectiveness studies are urgently needed that should also explore pharmacologic and nonpharmacologic augmentation strategies.
Abstract:
BACKGROUND: Neonates in a neonatal intensive care unit are exposed to a high number of painful procedures. Since repeated and sustained pain can have consequences for the neurological and behavioural development of the newborn, the greatest attention needs to be paid to systematic pain management in neonatology. Non-pharmacological treatment methods are being increasingly discussed with regard to pain prevention and relief, either alone or in combination with pharmacological treatment. AIMS: To identify effective non-pharmacological interventions with regard to procedural pain in neonates. METHODS: A literature search was conducted via the MEDLINE, CINAHL and Cochrane Library databases and complemented by a hand search. The literature search covered the period from 1984 to 2004. Data were extracted according to pre-defined criteria by two independent reviewers and methodological quality was assessed. RESULTS: 13 randomised controlled studies and two meta-analyses were taken into consideration with regard to the question of current nursing practice of non-pharmacological pain management methods. The selected interventions were "non-nutritive sucking", "music", "swaddling", "positioning", "olfactory and multisensorial stimulation", "kangaroo care" and "maternal touch". There is evidence that the methods of "non-nutritive sucking", "swaddling" and "facilitated tucking" do have a pain-alleviating effect on neonates. CONCLUSIONS: Some of the non-pharmacological interventions have an evident favourable effect on pulse rate, respiration and oxygen saturation, on the reduction of motor activity, and on excitation states after invasive measures. However, unambiguous evidence of this still remains to be presented. Further research should emphasise the use of validated pain assessment instruments for the evaluation of the pain-alleviating effect of non-pharmacological interventions.