934 results for Horizon d’attente


Relevance:

10.00%

Publisher:

Abstract:

The main objective of this thesis is to address energy efficiency in buildings, with regard to HVAC (climate-control) systems. The project was developed around the energy consumption of the different HVAC systems studied (and, consequently, of the building envelope), focusing on compliance with the thermal and energy requirements of the regulations in force in Portugal (RCCTE and RSECE). The aim was to identify the parameters with the greatest impact and the trends relating the constructive and technological solutions adopted, always with the goal of maximizing energy efficiency and reducing dependence on primary energy and, consequently, greenhouse gas emissions. The scope of this thesis includes comparing different types of HVAC systems in energy terms and making them as efficient as possible, so that they also become financially attractive and increase the benefit/cost ratio. To this end, a thermal study of the building envelope was first carried out, using building energy simulation software accredited under ASHRAE Standard 140-2004, in order to understand how the building behaved throughout the year and to introduce corrections to the envelope that lower the thermal/electrical loads of the HVAC equipment. Three candidate HVAC systems for the building were then studied to identify the most efficient on an annual basis, along with the possibility of combining renewable energy sources to meet as much of the building's thermal demand as possible while minimizing the consumption of energy from non-renewable sources. Finally, to assess the potential of each HVAC system studied, its economic viability was evaluated.
The final considerations of this thesis include a study of the benefits that a possible change to the building's architecture could bring by increasing its natural lighting, integrated with control of the artificial lighting required in the different climate-controlled spaces. Comparing the results obtained shows that correcting the exterior envelope reduces the building's energy consumption by about 11%; the corrective measures proposed for the baseline HVAC system yield an energy reduction of 43%; and, at the environmental level, CO2 emissions can be reduced by about 72.1%.

Relevance:

10.00%

Publisher:

Abstract:

In recent years we have witnessed an evolution in credit risk assessment. Constant changes in banking regulation, resulting from the Basel Accords, have imposed new rules that constrain the quantity and quality of credit risk that credit institutions can carry on their balance sheets. It is very important for credit institutions to assess credit risk, collateral and the cost of capital, as these have a direct impact on their management, namely on the allocation of resources and protection against losses. Accordingly, this work aims to build and structure an internal rating model using statistical techniques, and to identify the statistically relevant variables in the model considered. A mixed research methodology was adopted: a qualitative approach in the first part of the work and a quantitative approach in the second. Through documentary analysis, the theoretical concepts and the regulations underlying this work were reviewed. In the case study, the internal rating model was developed using multiple linear regression. The sample, obtained from the SABI database, consists of one hundred solvent companies located in the Paredes area over the 2011-2013 period. Our analysis was based on three scenarios, each corresponding to the data for one year (2011, 2012 and 2013). To validate the model's assumptions, the Durbin-Watson test and the F test of significance (ANOVA) were performed. Finally, the percentile technique was applied to obtain the rating classification for each variable. Across the three scenarios considered, scenario two obtained the highest coefficient of determination.
It was also found that the independent variables general liquidity ratio, coverage of total assets by working capital, and overall debt ratio are statistically relevant.
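As a rough illustration of the percentile technique mentioned in the abstract, the sketch below maps a financial ratio to rating classes via sample percentiles. The ratio values, cut-offs and class labels are invented for illustration, not taken from the study.

```python
# Hypothetical sketch: percentile-based rating classification.
# Sample values and class labels are made up, not the thesis's data.

def percentile(values, p):
    """Linear-interpolation percentile of a sample, p in [0, 100]."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def rating_from_percentiles(value, sample, labels=("C", "B", "A")):
    """Assign a rating class by comparing `value` to sample percentiles.

    With three classes, the 33.3rd and 66.7th percentiles act as cut-offs;
    lower ratio values fall into the worse classes.
    """
    cuts = [percentile(sample, 100.0 * i / len(labels))
            for i in range(1, len(labels))]
    for cut, label in zip(cuts, labels):
        if value <= cut:
            return label
    return labels[-1]

# Hypothetical general liquidity ratios for a sample of firms.
sample = [0.4, 0.7, 0.9, 1.1, 1.3, 1.6, 1.8, 2.2, 2.9, 3.5]
print(rating_from_percentiles(0.5, sample))  # low liquidity -> worst class
print(rating_from_percentiles(3.0, sample))  # high liquidity -> best class
```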

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION AND OBJECTIVES: Recently, three novel non-vitamin K antagonist oral anticoagulants received approval for reimbursement in Portugal for patients with non-valvular atrial fibrillation (AF). It is therefore important to evaluate the relative cost-effectiveness of these new oral anticoagulants in Portuguese AF patients. METHODS: A Markov model was used to analyze disease progression over a lifetime horizon. Relative efficacy data for stroke (ischemic and hemorrhagic), bleeding (intracranial, other major bleeding and clinically relevant non-major bleeding), myocardial infarction and treatment discontinuation were obtained by pairwise indirect comparisons between apixaban, dabigatran and rivaroxaban using warfarin as a common comparator. Data on resource use were obtained from the database of diagnosis-related groups and an expert panel. Model outputs included life years gained, quality-adjusted life years (QALYs), direct healthcare costs and incremental cost-effectiveness ratios (ICERs). RESULTS: Apixaban provided the most life years gained and QALYs. The ICERs of apixaban compared to warfarin and dabigatran were €5529/QALY and €9163/QALY, respectively. Apixaban was dominant over rivaroxaban (greater health gains and lower costs). The results were robust over a wide range of inputs in sensitivity analyses. Apixaban had a 70% probability of being cost-effective (at a threshold of €20 000/QALY) compared to all the other therapeutic options. CONCLUSIONS: Apixaban is a cost-effective alternative to warfarin and dabigatran and is dominant over rivaroxaban in AF patients from the perspective of the Portuguese national healthcare system. These conclusions are based on indirect comparisons, but despite this limitation, the information is useful for healthcare decision-makers.
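The ICER and dominance results quoted above follow the standard incremental cost-effectiveness arithmetic; a minimal sketch, with made-up per-patient costs and QALYs rather than the study's inputs:

```python
# Sketch of incremental cost-effectiveness ratio (ICER) logic.
# All cost and QALY numbers below are hypothetical illustrations.

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost per QALY gained of a new therapy versus a reference."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly > 0 and d_cost <= 0:
        return "dominant"      # more effective and no more costly
    if d_qaly <= 0 and d_cost >= 0:
        return "dominated"     # less effective and no cheaper
    return d_cost / d_qaly

# Hypothetical lifetime per-patient outputs from a Markov model.
print(icer(cost_new=14000.0, qaly_new=8.2, cost_ref=12000.0, qaly_ref=7.8))
print(icer(cost_new=11500.0, qaly_new=8.2, cost_ref=12000.0, qaly_ref=7.8))
```

A result below a willingness-to-pay threshold (e.g. €20 000/QALY) is read as cost-effective; "dominant" corresponds to the apixaban-versus-rivaroxaban case described above.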

Relevance:

10.00%

Publisher:

Abstract:

Nitrogen (N): from science to society is a science communication project that aims to raise young people's awareness of the threats that excess nitrogen (N) poses to humanity. It can be divided into two parts. The first, a research component, analyzes the results of a public consultation of teachers, using the qualitative focus-group method, to understand their sensitivity to the problem and their proposed solutions for minimizing excess N in the environment. The results obtained were instrumental in developing the second part: a project proposal to be submitted to Horizon 2020, under "Science with and for Society". It proposes a transdisciplinary educational approach, achieved through interaction among secondary-school and higher-education teachers, parents' associations and non-governmental civic organizations, aimed at raising young people's awareness of the threats of excess N in the environment, providing the scientific framing and technological approaches. The innovation of this proposal rests on: (i) the support and professional development of secondary-school teachers; (ii) motivating students to develop their own study and research under the tutorship of teachers, from both the school and higher education; and (iii) developing young people's communication skills so they can exercise active citizenship in minimizing the threats of N.

Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes the in- and out-of-sample predictability of the stock market returns of Eurozone banking sectors, arising from bank-specific ratios and macroeconomic variables, using panel estimation techniques. In order to do that, I set up an unbalanced panel of 116 banks' returns, from April 1991 to March 2013, to constitute equal-weighted, country-sorted portfolios representative of the Austrian, Belgian, Finnish, French, German, Greek, Irish, Italian, Portuguese and Spanish banking sectors. I find that both earnings per share (EPS) and the ratio of total loans to total assets have in-sample predictive power over the portfolios' monthly returns whereas, regarding the cross-section of annual returns, only EPS retains significant explanatory power. Nevertheless, the sign associated with the impact of EPS is contrary to the results of past literature. When looking at inter-yearly horizon returns, I document in-sample predictive power arising from the ratios of provisions to net interest income, and of non-interest income to net income. Regarding the out-of-sample performance of the proposed models, I find that these would only beat the portfolios' historical mean in the month following the disclosure of year-end financial statements. Still, the evidence found is not statistically significant. Finally, in a last attempt to find significant evidence of predictability of monthly and annual returns, I use the Fama-French three-factor and Carhart models to describe the cross-section of returns. Although in-sample the factors can significantly track Eurozone banking sectors' stock market returns, they do not beat the portfolios' historical mean when forecasting returns.

Relevance:

10.00%

Publisher:

Abstract:

An infinite-horizon discrete-time model with multiple size-class structures using a transition matrix is built to assess optimal harvesting schedules in the context of Non-Industrial Private Forest (NIPF) owners. Three model specifications, accounting for forest income, financial return on an asset and amenity valuations, are considered. Numerical simulations suggest uneven-aged forest management in which a rational forest owner adapts her or his forest policy by influencing the regeneration of trees or by adjusting consumption dynamics, depending on subjective time preference and the dynamics of the market return rate on the financial asset. Moreover, she or he does not place significant value on the non-market benefits captured by amenity valuations relative to forest income.
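The size-class transition-matrix dynamics described above can be sketched roughly as follows: each period, trees are harvested, survivors either stay in their size class or grow into the next one, and regeneration feeds the smallest class. All rates and quantities below are hypothetical, not the paper's calibration.

```python
# Minimal sketch of discrete-time size-class transition dynamics.
# Transition probabilities, recruitment and harvest levels are hypothetical.

def project(stock, stay, up, recruit, harvest):
    """One time step for a 3-size-class stand.

    stock:   trees per class, smallest first
    stay/up: per-class probabilities of remaining / moving up one class
    recruit: new small trees entering per period (regeneration)
    harvest: trees cut from each class this period (removed before growth)
    """
    s = [max(x - h, 0.0) for x, h in zip(stock, harvest)]  # harvest first
    nxt = [0.0, 0.0, 0.0]
    for i in range(3):
        nxt[i] += s[i] * stay[i]
        if i + 1 < 3:
            nxt[i + 1] += s[i] * up[i]
    nxt[0] += recruit
    return nxt

stock = [100.0, 60.0, 30.0]
for _ in range(5):  # five periods with a steady harvest of large trees
    stock = project(stock, stay=[0.70, 0.75, 0.85], up=[0.20, 0.15, 0.0],
                    recruit=25.0, harvest=[0.0, 0.0, 10.0])
print(stock)
```

An optimal schedule would choose `harvest` each period to maximize discounted forest income (plus amenity terms), which is where the owner's subjective time preference enters.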

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.

Relevance:

10.00%

Publisher:

Abstract:

Several papers document that idiosyncratic volatility is time-varying, and many attempts have been made to determine whether idiosyncratic risk is priced. This research studies the behavior of idiosyncratic volatility around information release dates and its relation with returns after public announcements. The results indicate that when a company discloses specific information to the market, the firm's specific volatility level shifts and short-horizon event-induced volatility varies significantly; however, the category to which the announcement belongs does not matter for the magnitude of the change. This event-induced volatility is not small and should not be downplayed in event studies. Moreover, this study shows that stocks with higher contemporaneous realized idiosyncratic volatility earn lower returns after public announcements, consistent with the "divergence of opinion" hypothesis. While no significant relation is found between EGARCH-estimated idiosyncratic volatility and returns, or between one-month-lagged idiosyncratic volatility and returns, presumably due to the significant jump around public announcements, both may provide signals about future idiosyncratic volatility through their correlations with contemporaneous realized idiosyncratic volatility. Finally, the study shows that the positive relation between returns and idiosyncratic volatility based on under-diversification is inadequate to explain all the different scenarios, and the negative relation after public announcements may provide a useful trading rule.

Relevance:

10.00%

Publisher:

Abstract:

Fragment 3: chapter 17. The upper band shows the double horizon, represented by two seated lions, as well as a phoenix and the deceased's mummy on its funerary bed. Fragment 2: chapter 17. The upper band shows the canopic jars, the cat catching a serpent, and Nut arched above a lion. Fragment 1: column 1 + vignette: chapter 18. The vignette shows three seated deities. Col. 2 + vignette (very fragmentary): chapter 18. What remains of the vignette shows the deceased standing, probably performing an adoration. Fragment 4: col. 1 + vignette: chapter 18. The vignette shows three seated deities. Col. 2 + vignette: same. Col. 3 + vignette: same.

Relevance:

10.00%

Publisher:

Abstract:

Introduction. The question of the meaning, methods and philosophical manifestations of history is currently rife with contention. The problem that I will address in an exposition of the thought of Wilhelm Dilthey and Martin Heidegger centers on the intersubjectivity of an historical world. Specifically, there are two interconnected issues. First, since all knowledge occurs to a person from within his or her historical age, how can any person in any age make truth claims? In order to answer this concern we must understand the essence and role of history. Yet how can we come to an individual understanding of what history is when the meanings that we use are themselves historically enveloped? But can we, who are well aware of the knowledge that archaeology has dredged up from old texts or even from 'living' monuments of past ages, really neglect to notice these artifacts that exist within and enrich our world? Charges of wilful blindness would arise if any attempt were made to suggest that certain things of our world did not come down to us from the past. Thus it appears more important to determine what this 'past' is, and therefore how history operates, than simply to derail the possibility of historical understanding. Wilhelm Dilthey, the great German historicist of the 19th century, did not question the existence of historical artifacts as coming from the past, but in treating knowledge as one such artifact he placed the onus on knowledge to show itself as true, or meaningful, in light of the fact that other historical periods relied on different facts and generated different truths or meanings. The problem for him was not just determining what the role of history is, but moreover discovering how knowledge could make any claim to be true knowledge. As he stated, there is a problem of "historical anarchy". Martin Heidegger picked up these two strands of Dilthey's thought and wanted to answer the problem of truth and meaning in order to solve the problem of historicism.
This problem underscored, perhaps for the first time, that societal presuppositions about the past and present of an era are not immutable. Penetrating to the core of the raison d'etre of the age was an historical reflection about the past, which was now conceived as separated both temporally and attitudinally from the present. But further than this, Heidegger's focus on asking the question of the meaning of Being meant that history must be ontologically explicated, not merely ontically treated. Heidegger hopes to remove barriers to a genuine ontology by including history in an assessment of previous philosophical systems. He does this in order that the question of Being be more fully explicated, which for him necessarily includes the question of the Being of history. One approach to the question of what history is, given the information that we get from historical knowledge, is whether such knowledge can be formalized into a science. Additionally, we can approach the question of what the essence and role of history is by revealing its underlying characteristics, that is, by focussing on historicality. Thus we will begin with an expository look at Dilthey's conception of history and historicality. We will then explore these issues first in Heidegger's Being and Time, and then, in the third chapter, in his middle and later works. Finally, we shall examine how Heidegger's conception may reflect a development in the conception of historicality over Dilthey's historicism, and what such a conception means for a contemporary historical understanding. The problem of existing in a common world which is perceived only individually has been philosophically addressed in many forms. Escaping a pure subjectivist interpretation of 'reality' has occupied Western thinkers not only in order to discover metaphysical truths, but also to provide a foundation for politics and ethics.
Many thinkers accept a solipsistic view as inevitable and reject attempts at justifying truth in an intersubjective world. The problem of historicality raises similar problems. We exist in a common historical age, presumably, yet are only aware of the historicity of the age through our own individual thoughts. Thus the question arises: do we actually exist within a common history, or do we merely individually interpret this as communal? What is the reality of history, individual or communal? Dilthey answers this question by asserting a 'reality' to the historical age, thus overcoming solipsism by encasing individual human experience within the historical horizon of the age. This, however, does nothing to address the epistemological concern over the discoverability of truth. Heidegger, on the other hand, rejects a metaphysical construal of history and seeks to ground history first within the ontology of Dasein and second within the so-called "sending" of Being. Thus there can be no solipsism for Heidegger, because Dasein's Being is necessarily "co-historical", Being-with-Others, and furthermore, this historical-Being-in-the-world-with-Others is the horizon of Being over which truth can appear. Heidegger's solution to the problem of solipsism appears to satisfy both that the world is not just a subjective idealist creation and that one need not appeal to any universal measures of truth or presumed eternal verities. Thus, in elucidating Heidegger's notion of history, I will also confront the issues of Dasein's Being-alongside-things as well as the Being of Dasein as Being-in-the-world, so that Dasein's historicality is explicated vis-a-vis the "sending of Being" (das Schicken des Seins).

Relevance:

10.00%

Publisher:

Abstract:

The effects of relative water level changes in Lake Ontario were detected in the physical, chemical and biological characteristics of the sediments of the Fifteen, Sixteen and Twenty Mile Creek lagoonal complexes. Regional environmental changes have occurred, resulting in the following sequence of sediments in the three lagoons and marsh. From the base up they are: (1) Till, (2) Pink Clay, (3) Bottom Sand, (4) Gyttja, (5) Orange Sandy Silt, (6) Brown Clay and (7) Gray Clay. The till was only encountered in the marsh and channel; however, it is presumed to occur throughout the entire area. The presence of diatoms and sponge spicules, the vertical and longitudinal uniformity of the sediment and the stratigraphic position of the Pink Clay indicate that it has a glacial or post-glacial lacustrine origin. Overlying the Pink Clay or Till is a clayey, silty sand to gravel. The downstream fining and unsorted nature of this material indicate that it has a fluvial/deltaic origin. Water levels began rising in the lagoon 3,250 years ago, resulting in the deposition of the Gyttja, a brown, organic-rich silty clay probably deposited in a shallow, stagnant environment, as shown by the presence of pyrite in the organic material and relatively high proportions of benthic diatoms and grass pollen. An increase in the rate of deposition of the Gyttja on Twenty Mile Creek and a decrease in the same unit on Sixteen Mile Creek is possibly the result of a capture of the Sixteen Mile Creek by the Twenty Mile Creek. The rise in lake level responsible for the onset and transgression of this unit may have been produced by isostatic rebound; however, the deposition also corresponds closely to a drop in the level of Lake Huron and increased flow through the lower lakes. The Orange Sandy Silt, present only in the marsh, appears to be a buried soil horizon, as shown by oxidized roots, and may be the upland equivalent of the Gyttja.
Additional deepening resulted in the deposition of the Brown Clay, a unit which only occurs at the lakeward end of the three lagoons. The decrease in grass pollen and the relatively high proportion of pelagic diatoms are evidence for this. The deepening may be the result of isostatic rebound; however, the onset of its deposition at 1640 years B.P. is synchronous in the three lagoons and corresponds to the end of the sub-Atlantic climatic episode. The effect of the climatic change in southern Ontario is uncertain. Average deposition rates of the Brown Clay are similar to those in the upper Gyttja on Sixteen Mile Creek; however, Twenty Mile Creek shows lower rates in the Brown Clay than in the upper Gyttja. The Gray Clay covers the present bottom of the three lagoons and also occurs in the marsh. It is interlaminated with sand in the channels. Increases in the rates of deposition, high concentrations of Ca and Zn, an Ambrosia rise, and an increase in bioturbation, possibly due to the activities of the carp, indicate that this unit is a recent deposit resulting from the activities of man.

Relevance:

10.00%

Publisher:

Abstract:

Landscape geochemical investigations were conducted on portions of a natural uniform landscape in southern Norway. This consisted of sampling both soil profiles and spruce tree twigs for the analysis of twelve chemical elements: cobalt, copper, nickel, lead, zinc, manganese, magnesium, iron, calcium, sodium, potassium and aluminum, determined by atomic absorption analysis using standardized extraction techniques for both organic and inorganic materials. Two "landscape traverses" were chosen for a comparative study of the effects of varying landscape parameters upon the trace element distribution patterns throughout the landscape traverses. The object of this study was to test this method of investigation and the concept of an ideal uniform landscape under Norwegian conditions. A "control traverse" was established to represent uniform landscape conditions typical of the study area and was used to determine "normal" or average trace element distribution patterns. A "signal traverse" was selected nearby, over an area of lead mineralization where the depth to bedrock is very small. The signal traverse provided an area of landscape conditions similar to those of the control traverse, with significant differences in the bedrock configuration and composition. This study also aimed to determine the effect of the bedrock mineralization upon the distribution patterns of the twelve chemical elements within the major components of the two landscape traverses (i.e. soil profiles and tree branches). The lead distribution within the soils of the signal traverse showed localized accumulations of lead within the overburden, with maximum values occurring within the organic A horizon of soil profile #10. Above-average concentrations of lead were common within the signal traverse; however, the other elements studied were not significantly different from the averages determined throughout the soils of the control traverse.
The spruce twig samples did not have corresponding accumulations of lead near the soil lead anomaly. This is attributable to the very localized nature of the lead dispersion pattern within the soils. This approach to the study of the geochemistry of a natural landscape was effective in establishing: a) average or "normal" trace element distribution patterns, b) local variations in the landscape morphology, and c) the effect of unusually high lead concentrations upon the geochemistry of the landscape (i.e. within the soil profiles and tree branches). This type of study provides the basis for further, more intensive studies and serves only as a first approximation of the behaviour of elements within a natural landscape.

Relevance:

10.00%

Publisher:

Abstract:

This research focuses on generating aesthetically pleasing images in virtual environments using the particle swarm optimization (PSO) algorithm. PSO is a stochastic, population-based search algorithm inspired by the flocking behavior of birds. In this research, we implement swarms of cameras flying through a virtual world in search of an image that is aesthetically pleasing. Virtual world exploration using particle swarm optimization is considered a new research area and is of interest to both the scientific and artistic communities. Aesthetic rules such as the rule of thirds, subject matter, colour similarity and horizon line are treated together as a multi-objective problem to be analyzed and solved over rendered images. A new multi-objective PSO algorithm, the sum of ranks PSO, is introduced. It is empirically compared to other single-objective and multi-objective swarm algorithms. An advantage of the sum of ranks PSO is that it is useful for solving high-dimensional problems within the context of this research. Through many experiments, we show that our approach is capable of automatically producing images satisfying a variety of supplied aesthetic criteria.
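The sum-of-ranks aggregation the abstract introduces can be illustrated with a small sketch: each candidate is ranked separately on every objective, and the rank sums serve as a single scalar fitness that a PSO can minimize. The objective values below are invented, not the thesis's aesthetic scores.

```python
# Sketch of sum-of-ranks aggregation for multi-objective fitness.
# Objective values are hypothetical; lower is better on every objective.

def sum_of_ranks(scores):
    """scores[p][k] = value of objective k for candidate p.

    Returns one rank-sum per candidate; the smallest sum is the best one.
    """
    n_obj = len(scores[0])
    sums = [0] * len(scores)
    for k in range(n_obj):
        order = sorted(range(len(scores)), key=lambda p: scores[p][k])
        for rank, p in enumerate(order):
            sums[p] += rank
    return sums

# Four camera placements scored on three aesthetic objectives
# (e.g. rule-of-thirds error, colour dissimilarity, horizon tilt).
scores = [[0.9, 0.2, 0.5],
          [0.1, 0.8, 0.4],
          [0.3, 0.3, 0.3],
          [0.7, 0.9, 0.9]]
sums = sum_of_ranks(scores)
best = min(range(len(sums)), key=sums.__getitem__)
print(sums, best)
```

Ranking sidesteps the scaling problems of a weighted sum: each objective contributes equally regardless of its units, which helps as the number of objectives grows.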

Relevance:

10.00%

Publisher:

Abstract:

For the past 20 years, researchers have applied the Kalman filter to modeling and forecasting the term structure of interest rates. Despite its impressive performance in fitting yield curves in-sample, little research has focused on out-of-sample forecasting of yield curves using the Kalman filter. The goal of this thesis is to develop a unified dynamic model based on Diebold and Li (2006) and Nelson and Siegel's (1987) three-factor model, and to estimate this dynamic model using the Kalman filter. We compare both the in-sample and out-of-sample performance of our dynamic methods with various other models in the literature. We find that our dynamic model dominates existing models in medium- and long-horizon yield curve predictions. However, the dynamic model should be used with caution when forecasting short-maturity yields.
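A minimal sketch of the Nelson-Siegel three-factor yield curve that underlies the Diebold-Li dynamic model: the three betas act as level, slope and curvature factors (in the thesis they would be latent states tracked by the Kalman filter). The factor values and decay parameter below are hypothetical.

```python
# Sketch of the Nelson-Siegel (1987) yield curve with hypothetical factors.
import math

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield for maturity tau (years): level plus slope/curvature loadings."""
    x = lam * tau
    slope_load = (1.0 - math.exp(-x)) / x   # loading on the slope factor
    curve_load = slope_load - math.exp(-x)  # loading on the curvature factor
    return beta0 + beta1 * slope_load + beta2 * curve_load

# Hypothetical factors: long-run level 5%, negative slope, mild curvature.
curve = [nelson_siegel(t, beta0=0.05, beta1=-0.02, beta2=0.01, lam=0.6)
         for t in (0.25, 1, 2, 5, 10, 30)]
print([round(y, 4) for y in curve])
```

As maturity grows the slope loading decays to zero, so long yields approach the level factor beta0, while the short end sits near beta0 + beta1; the Diebold-Li idea is to let these factors evolve over time and forecast them.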

Relevance:

10.00%

Publisher:

Abstract:

Granger causality is usually defined as the predictability of one vector of variables by another one period ahead. Recently, Lutkepohl (1990) proposed defining non-causality between two variables (or vectors) as non-predictability at all horizons in the future. When more than two vectors are considered (i.e., when the information set contains auxiliary variables), these two notions are not equivalent. In this paper, we first generalize earlier notions of causality by considering causality at a given arbitrary horizon h, finite or infinite. We then derive necessary and sufficient conditions for non-causality between two vectors of variables (within a larger vector) up to a given horizon h. The models considered include vector autoregressions, possibly of infinite order, and multivariate ARIMA models. In particular, we give separability and rank conditions for non-causality up to a horizon h, which are relatively simple to verify.
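The horizon-h distinction can be illustrated in a simple VAR(1): when an auxiliary variable is present, x can be non-causal for y at horizon 1 yet causal at horizon 2 through an indirect chain x → z → y, since the horizon-h forecast weights come from powers of the coefficient matrix. The coefficient matrix below is hypothetical, a sketch of the phenomenon rather than any condition from the paper.

```python
# Sketch: in a VAR(1) y_t = A y_{t-1} + e_t, the weight of variable j's lag in
# variable i's h-step forecast is the (i, j) entry of A**h. A is hypothetical.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def causal_entry(A, h, i, j):
    """(i, j) entry of A**h, for h >= 1."""
    P = A
    for _ in range(h - 1):
        P = matmul(P, A)
    return P[i][j]

# Variable ordering: y, x, z. Here x feeds y only indirectly through z.
A = [[0.5, 0.0, 0.4],   # y depends on its own lag and on z
     [0.0, 0.3, 0.0],   # x is autonomous
     [0.0, 0.6, 0.2]]   # z depends on x

print(causal_entry(A, 1, 0, 1))  # horizon 1: zero, x does not help predict y
print(causal_entry(A, 2, 0, 1))  # horizon 2: nonzero, via the chain x -> z -> y
```

This is exactly why, with auxiliary variables, one-step non-causality does not imply non-causality at all horizons, motivating the horizon-h conditions the paper derives.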