940 results for global nonhydrostatic model


Relevance: 30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Publisher:

Abstract:

In this work, equations for estimating global solar irradiation (R_G) by means of the Angstrom model are presented, with seasonal and monthly partitions, for the region of Cascavel, Paraná (PR), Brazil. The experimental data were provided by IAPAR and collected at its meteorological station located at COODETEC/Cascavel - PR over the period 1983 to 1998. Of the 16 years of data, 12 years were used to compute the coefficients (a and b) and four years to validate the equations. The coefficients of determination exceeded 80% for both partitions. The minimum of R_G is overestimated and the maximum is underestimated when compared with the observed minimum and maximum, which occur at the winter solstice and the spring equinox, respectively. The seasonal and monthly variation of coefficient a was smaller (0.16 to 0.19 and 0.14 to 0.21) and that of coefficient b larger (0.34 to 0.43 and 0.32 to 0.44). The largest variations of the mean daily errors occurred at the spring equinox (-19.45% to 27.28%) and the smallest at the autumn equinox (-11.32% to 10.61%). The most effective fit of the equations was obtained for the monthly partition.
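
For reference, the Angstrom (Angstrom-Prescott) relation underlying these estimates is conventionally written as a linear regression of the clearness index on relative sunshine duration; the coefficients a and b below are those reported in the abstract, while the remaining symbols are standard notation assumed here.

```latex
% Angstrom-Prescott relation (standard form; symbols other than a and b are assumed notation):
\[
  \frac{R_G}{R_0} \;=\; a + b\,\frac{n}{N} ,
\]
% where $R_G$ is the daily global solar irradiation at the surface, $R_0$ the extraterrestrial
% irradiation, $n$ the measured sunshine duration, $N$ the maximum possible sunshine duration,
% and $a$, $b$ are the empirically fitted coefficients reported above.
```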

Relevance: 30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Publisher:

Abstract:

The third primary production algorithm round robin (PPARR3) compares output from 24 models that estimate depth-integrated primary production from satellite measurements of ocean color, as well as from seven general circulation models (GCMs) coupled with ecosystem or biogeochemical models. Here we compare the global primary production fields corresponding to eight months of 1998 and 1999 as estimated from common input fields of photosynthetically available radiation (PAR), sea-surface temperature (SST), mixed-layer depth, and chlorophyll concentration. We also quantify the sensitivity of the ocean-color-based models to perturbations in their input variables. The pair-wise correlation between ocean-color models was used to cluster them into groups of related output, which reflect the regions and environmental conditions under which they respond differently. The groups do not follow model complexity with regard to wavelength or depth dependence, though they are related to the manner in which temperature is used to parameterize photosynthesis. Global average PP varies by a factor of two between models. The models diverged the most for the Southern Ocean, SST under 10 degrees C, and chlorophyll concentration exceeding 1 mg Chl m⁻³. Based on the conditions under which the model results diverge most, we conclude that current ocean-color-based models are challenged by high-nutrient low-chlorophyll conditions and by extreme temperatures or chlorophyll concentrations. The GCM-based models predict primary production comparable to that of the ocean-color-based models: they estimate higher values in the Southern Ocean, at low SST, and in the equatorial band, while they estimate lower values in eutrophic regions (probably because the area of high chlorophyll concentrations is smaller in the GCMs). Further progress in primary production modeling requires improved understanding of the effect of temperature on photosynthesis and better parameterization of the maximum photosynthetic rate. (c) 2006 Elsevier Ltd. All rights reserved.
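
As an illustration of the clustering step described above, the sketch below groups models by the pairwise correlation of their output fields using hierarchical clustering; the array shapes, distance threshold, and data are hypothetical assumptions, not details taken from PPARR3.

```python
# Minimal sketch: cluster models by pairwise correlation of their global PP fields.
# Shapes, threshold and the synthetic data are illustrative assumptions, not PPARR3 specifics.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_models_by_correlation(pp_fields, threshold=0.3):
    """pp_fields: (n_models, n_gridcells) array of PP estimates on a common grid."""
    corr = np.corrcoef(pp_fields)          # pairwise Pearson correlation between model fields
    dist = 1.0 - corr                       # convert similarity to a distance
    np.fill_diagonal(dist, 0.0)
    condensed = squareform(dist, checks=False)
    tree = linkage(condensed, method="average")
    return fcluster(tree, t=threshold, criterion="distance")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fields = rng.normal(size=(24, 5000))    # 24 models on a hypothetical grid
    print(cluster_models_by_correlation(fields))
```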

Relevance: 30.00%

Publisher:

Abstract:

Extensive systematizations of theoretical and experimental nuclear densities and of optical potential strengths extracted from heavy-ion elastic scattering data analyses at low and intermediate energies are presented. The energy dependence of the nuclear potential is accounted for within a model based on the nonlocal nature of the interaction. The systematics indicates that the heavy-ion nuclear potential can be described in a simple global way through a double-folding shape, which basically depends only on the density of nucleons of the partners in the collision. The possibility of extracting information about the nucleon-nucleon interaction from the heavy-ion potential is investigated.
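
For context, the double-folding shape referred to above is conventionally defined as the nucleon densities of the two colliding nuclei folded with an effective nucleon-nucleon interaction (standard notation assumed here):

```latex
% Double-folding potential in the standard notation of heavy-ion scattering (assumed here):
\[
  V_F(\mathbf{R}) \;=\; \int\!\!\int
  \rho_1(\mathbf{r}_1)\, \rho_2(\mathbf{r}_2)\,
  v_{NN}\!\left(\mathbf{R} + \mathbf{r}_2 - \mathbf{r}_1\right)
  \, \mathrm{d}^3 r_1 \, \mathrm{d}^3 r_2 ,
\]
% where $\rho_1$ and $\rho_2$ are the nucleon densities of the colliding nuclei, $v_{NN}$ is an
% effective nucleon-nucleon interaction, and $\mathbf{R}$ is the separation between their centers.
```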

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a new model for the representation of the electrode filaments of hot-cathode fluorescent lamps during preheating processes based on the injection of currents with constant root mean square (rms) values. The main improvement obtained with this model is the prediction of the R_h/R_c ratio during the preheating process as a function of the preheating time and of the rms current injected into the electrodes. Using the proposed model, it is possible to estimate the time interval and the current that should be provided by the electronic ballast in order to ensure a suitable preheating process. This estimate of time and current can be used as input data in the design of electronic ballasts with programmed lamp start, permitting the prediction of the R_h/R_c ratio during the initial steps of the design (theoretical analysis and digital simulation). Therefore, the use of the proposed model reduces the need for repeated empirical adjustments of the prototype to set the operation of electronic ballasts during the preheating process, which reduces the time and costs associated with the overall design procedure of electronic ballasts.
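
The sketch below is a deliberately simplified lumped thermal model, not the model proposed in the paper: it integrates a filament heat balance under a constant rms current and reports the resulting hot-to-cold resistance ratio R_h/R_c after a given preheating time. All parameter values and the T^1.2 resistivity scaling for tungsten are illustrative assumptions.

```python
# Illustrative lumped thermal model of a lamp filament during preheating.
# NOT the paper's model: parameters and the R(T) ~ T^1.2 tungsten scaling are assumptions.

def preheat_rh_rc(i_rms, t_preheat, r_cold=3.0, t_amb=300.0,
                  heat_capacity=2e-3, loss_coeff=5e-3, dt=1e-3):
    """Return the R_h/R_c ratio reached after t_preheat seconds at a constant rms current (A)."""
    temp = t_amb
    for _ in range(int(t_preheat / dt)):
        r = r_cold * (temp / t_amb) ** 1.2           # filament resistance grows with temperature
        p_in = i_rms ** 2 * r                        # Joule heating by the injected rms current
        p_out = loss_coeff * (temp - t_amb)          # lumped conduction/convection losses
        temp += (p_in - p_out) / heat_capacity * dt  # explicit Euler step of the heat balance
    return (temp / t_amb) ** 1.2

if __name__ == "__main__":
    for i_rms in (0.4, 0.6, 0.8):                    # hypothetical preheating currents
        print(i_rms, "A ->", round(preheat_rh_rc(i_rms, t_preheat=0.7), 2))
```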

Relevance: 30.00%

Publisher:

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with the integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared with the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
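
As a sketch of the column generation technique mentioned above, the code below implements the classic Gilmore-Gomory procedure for the cutting-stock LP relaxation alone (a restricted master LP plus a knapsack pricing problem). The coupling with lot sizing, the sample data, and the solver choice (scipy's HiGHS interface) are simplifications and assumptions, not the paper's formulation.

```python
# Column generation for the LP relaxation of a one-dimensional cutting-stock problem.
# A simplified building block of the combined approach; data and solver are assumptions.
import numpy as np
from scipy.optimize import linprog

def solve_knapsack(lengths, values, capacity):
    """Unbounded knapsack by dynamic programming: best pattern value and the pattern itself."""
    best = np.zeros(capacity + 1)
    choice = np.full(capacity + 1, -1)
    for c in range(1, capacity + 1):
        for i, (l, v) in enumerate(zip(lengths, values)):
            if l <= c and best[c - l] + v > best[c]:
                best[c], choice[c] = best[c - l] + v, i
    pattern, c = np.zeros(len(lengths), dtype=int), capacity
    while c > 0 and choice[c] >= 0:
        pattern[choice[c]] += 1
        c -= lengths[choice[c]]
    return best[capacity], pattern

def cutting_stock_lp(lengths, demands, roll_length, tol=1e-9):
    # trivial starting columns: each roll cut into copies of a single piece type
    patterns = [np.eye(len(lengths), dtype=int)[i] * (roll_length // l)
                for i, l in enumerate(lengths)]
    while True:
        A = np.column_stack(patterns)
        # restricted master LP: min sum(x) s.t. A x >= d  (written as -A x <= -d for linprog)
        res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-np.array(demands),
                      bounds=(0, None), method="highs")
        duals = -res.ineqlin.marginals               # duals of the demand constraints
        value, pattern = solve_knapsack(lengths, duals, roll_length)
        if value <= 1 + tol:                         # no column with negative reduced cost left
            return res.fun, patterns, res.x
        patterns.append(pattern)

if __name__ == "__main__":
    rolls, pats, x = cutting_stock_lp(lengths=[45, 36, 31, 14],
                                      demands=[97, 610, 395, 211], roll_length=100)
    print("LP lower bound on number of rolls:", round(rolls, 2))
```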

Relevance: 30.00%

Publisher:

Abstract:

Depth-integrated primary productivity (PP) estimates obtained from satellite ocean-color-based models (SatPPMs) and those generated from biogeochemical ocean general circulation models (BOGCMs) represent a key resource for biogeochemical and ecological studies at global as well as regional scales. Calibration and validation of these PP models are not straightforward, however, and comparative studies show large differences between model estimates. The goal of this paper is to compare PP estimates obtained from 30 different models (21 SatPPMs and 9 BOGCMs) to a tropical Pacific PP database consisting of approximately 1000 C-14 measurements spanning more than a decade (1983-1996). Primary findings include: skill varied significantly between models, but performance was not a function of model complexity or type (i.e. SatPPM vs. BOGCM); nearly all models underestimated the observed variance of PP, specifically yielding too few low PP (< 0.2 g C m⁻² d⁻¹) values; more than half of the total root-mean-squared model-data differences associated with the satellite-based PP models might be accounted for by uncertainties in the input variables and/or the PP data; and the tropical Pacific database captures a broad-scale shift from low biomass-normalized productivity in the 1980s to higher biomass-normalized productivity in the 1990s, which was not successfully captured by any of the models. This latter result suggests that interdecadal and global changes will be a significant challenge for both SatPPMs and BOGCMs. Finally, average root-mean-squared differences between in situ PP data on the equator at 140 degrees W and PP estimates from the satellite-based productivity models were 58% lower than analogous values computed in a previous PP model comparison six years ago. The success of these types of comparison exercises is illustrated by the continual modification and improvement of the participating models and the resulting increase in model skill. (c) 2008 Elsevier B.V. All rights reserved.
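
As an illustration of the skill metric used in this type of comparison, the snippet below computes root-mean-squared differences between modeled and observed depth-integrated PP; the log10 transform and the variable names are assumptions for the sketch, not details taken from the paper.

```python
# Sketch of a model-data skill metric: RMSD between modeled and in situ PP.
# The log10 transform and the sample values are assumptions for illustration.
import numpy as np

def rmsd(model_pp, insitu_pp, log_space=True):
    """Root-mean-squared difference; PP in g C m^-2 d^-1, matched point by point."""
    m, o = np.asarray(model_pp, float), np.asarray(insitu_pp, float)
    if log_space:                      # compress the large dynamic range of PP
        m, o = np.log10(m), np.log10(o)
    return float(np.sqrt(np.mean((m - o) ** 2)))

if __name__ == "__main__":
    obs = np.array([0.15, 0.4, 0.9, 1.6])   # hypothetical C-14 measurements
    mod = np.array([0.30, 0.5, 0.7, 1.2])   # hypothetical model estimates at the same points
    print(round(rmsd(mod, obs), 3))
```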

Relevance: 30.00%

Publisher:

Abstract:

Nearly half of the earth's photosynthetically fixed carbon derives from the oceans. To determine global and region-specific rates, we rely on models that estimate marine net primary productivity (NPP); thus it is essential that these models are evaluated to determine their accuracy. Here we assessed the skill of 21 ocean color models by comparing their estimates of depth-integrated NPP to 1156 in situ C-14 measurements encompassing ten marine regions including the Sargasso Sea, pelagic North Atlantic, coastal Northeast Atlantic, Black Sea, Mediterranean Sea, Arabian Sea, subtropical North Pacific, Ross Sea, West Antarctic Peninsula, and the Antarctic Polar Frontal Zone. Average model skill, as determined by root-mean-square difference calculations, was lowest in the Black and Mediterranean Seas, highest in the pelagic North Atlantic and the Antarctic Polar Frontal Zone, and intermediate in the other six regions. The maximum fraction of model skill that may be attributable to uncertainties in both the input variables and in situ NPP measurements was nearly 72%. On average, the simplest depth/wavelength-integrated models performed no worse than the more complex depth/wavelength-resolved models. Ocean color models were not highly challenged in extreme conditions of surface chlorophyll-a and sea surface temperature, nor in high-nitrate low-chlorophyll waters. Water column depth was the primary influence on ocean color model performance, such that average skill was significantly higher at depths greater than 250 m, suggesting that ocean color models are more challenged in Case-2 (coastal) waters than in Case-1 (pelagic) waters. Given that in situ chlorophyll-a data were used as input, algorithm improvement is required to eliminate the poor performance of ocean color NPP models in Case-2 waters close to coastlines. Finally, ocean color chlorophyll-a algorithms are themselves challenged by optically complex Case-2 waters, so using satellite-derived chlorophyll-a to estimate NPP in coastal areas would likely further reduce the skill of ocean color models.

Relevance: 30.00%

Publisher:

Abstract:

A branch and bound algorithm is proposed to solve the H2-norm model reduction problem and the H2-norm controller reduction problem, with conditions assuring convergence to the global optimum in finite time. The lower and upper bounds used in the optimization procedure are obtained through linear matrix inequalities formulations. Examples illustrate the results.
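
For reference, the problem addressed can be stated in standard form as follows (notation assumed here, not quoted from the paper):

```latex
% H2-norm model reduction in standard notation (assumed here, not quoted from the paper):
% given a stable system $G(s)$ of order $n$, find a stable $G_r(s)$ of order $r < n$ solving
\[
  \min_{\operatorname{order}(G_r) = r} \; \| G - G_r \|_2^2 ,
  \qquad
  \| H \|_2^2 \;=\; \frac{1}{2\pi} \int_{-\infty}^{\infty}
  \operatorname{trace}\!\left[ H(j\omega)^{*} H(j\omega) \right] \mathrm{d}\omega .
\]
```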

Relevance: 30.00%

Publisher:

Abstract:

Simulations of overshooting tropical deep convection using a Cloud Resolving Model with bulk microphysics are presented in order to examine the effect on the water content of the TTL (Tropical Tropopause Layer) and lower stratosphere. This case study is a subproject of the HIBISCUS (Impact of tropical convection on the upper troposphere and lower stratosphere at global scale) campaign, which took place in Bauru, Brazil (22° S, 49° W), from the end of January to early March 2004. Comparisons between 2-D and 3-D simulations suggest that the use of 3-D dynamics is vital in order to capture the mixing between the overshoot and the stratospheric air, which caused evaporation of ice and resulted in an overall moistening of the lower stratosphere. In contrast, a dehydrating effect was predicted by the 2-D simulation due to the extra time, allowed by the lack of mixing, for the ice transported to the region to precipitate out of the overshoot air. Three different strengths of convection are simulated in 3-D by applying successively lower heating rates (used to initiate the convection) in the boundary layer. Moistening is produced in all cases, indicating that convective vigour is not a factor in whether moistening or dehydration is produced by clouds that penetrate the tropopause, since the weakest case only just did so. An estimate of the moistening effect of these clouds on an air parcel traversing a convective region is made based on the domain-mean simulated moistening and the frequency of convective events observed by the IPMet (Instituto de Pesquisas Meteorológicas, Universidade Estadual Paulista) radar (S-band type at 2.8 GHz) to have the same 10 dBZ echo top height as those simulated. These suggest a fairly significant mean moistening of 0.26, 0.13 and 0.05 ppmv in the strongest, medium and weakest cases, respectively, for heights between 16 and 17 km. Since the cold point and WMO (World Meteorological Organization) tropopause in this region lie at ∼15.9 km, this is likely to represent direct stratospheric moistening. Much more moistening is predicted for the 15-16 km height range, with predicted increases of 0.85-2.8 ppmv. However, this air would need to be lofted through the tropopause via the Brewer-Dobson circulation in order to have a stratospheric effect. Whether this is likely is uncertain and, in addition, the dehydration of air as it passes through the cold trap and the number of times that trajectories sample convective regions need to be taken into account to gauge the overall stratospheric effect. Nevertheless, the results suggest a potentially significant role for convection in determining the stratospheric water content. Sensitivity tests exploring the impact of increased aerosol numbers in the boundary layer suggest that a corresponding rise in cloud droplet numbers at cloud base would increase the number concentrations of the ice crystals transported to the TTL, which had the effect of reducing the fall speeds of the ice and causing a ∼13% rise in the mean vapour increase in both the 15-16 and 16-17 km height ranges when compared with the control case. Increases in total water were much larger, being 34% and 132% higher for the same height ranges, respectively, but it is unclear whether the extra ice will be able to evaporate before precipitating out of the region. These results suggest a possible impact of natural and anthropogenic aerosols on how convective clouds affect stratospheric moisture levels.

Relevance: 30.00%

Publisher:

Abstract:

The GPS observables are subject to several errors. Among them, the systematic errors have the greatest impact, because they degrade the accuracy of the resulting positioning. These errors are related mainly to the GPS satellite orbits, multipath, and atmospheric effects. Recently, a method has been suggested to mitigate these errors: the semiparametric model with the penalised least squares technique (PLS). In this method, the errors are modeled as functions varying smoothly in time. It is as if the stochastic model, into which the error functions are incorporated, were changed; the results obtained are similar to those obtained when the functional model is changed. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method (CLS). In general, the solution requires a shorter data interval, minimizing costs. The method's performance was analyzed in two experiments using data from single-frequency receivers. The first was carried out with a short baseline, where the main error was multipath. In the second experiment, a baseline of 102 km was used; in this case, the predominant errors were due to ionospheric and tropospheric refraction. In the first experiment, using 5 minutes of data collection, the largest coordinate discrepancies in relation to the ground truth reached 1.6 cm and 3.3 cm in the h coordinate for the PLS and the CLS, respectively. In the second, also using 5 minutes of data, the discrepancies were 27 cm in h for the PLS and 175 cm in h for the CLS. In these tests, it was also possible to verify a considerable improvement in ambiguity resolution using the PLS in relation to the CLS, with a reduced data collection time interval. © Springer-Verlag Berlin Heidelberg 2007.
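
A generic form of the penalised least squares criterion behind the semiparametric approach is sketched below; the specific design matrices and the penalty used in the paper may differ, so this is assumed notation only.

```latex
% Penalised least squares in a semiparametric model (generic form, assumed notation):
% y = A x + g + e, with x the parametric unknowns (ambiguities, coordinates) and g a
% discretised, smoothly time-varying signal absorbing the systematic errors.
\[
  \min_{x,\,g}\;
  \left( y - A x - g \right)^{\mathsf T} W \left( y - A x - g \right)
  \;+\; \lambda\, g^{\mathsf T} P\, g ,
\]
% where $W$ is the observation weight matrix, $P$ a roughness-penalty matrix acting on $g$,
% and $\lambda > 0$ controls how smooth the estimated error functions are allowed to be.
```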

Relevance: 30.00%

Publisher:

Abstract:

Includes bibliography

Relevance: 30.00%

Publisher:

Abstract:

Includes bibliography