876 results for "Meteorological problems"
Abstract:
Systematic errors can have a significant effect on GPS observables. In medium and long baselines, the major systematic error sources are ionospheric and tropospheric refraction and GPS satellite orbit errors; in short baselines, multipath is more relevant. These errors degrade the accuracy of GPS positioning, which is a critical problem for high-precision GPS positioning applications. Recently, a method has been suggested to mitigate these errors: the semiparametric model with the penalised least squares technique. It uses a natural cubic spline to model the errors as a function that varies smoothly in time. The systematic-error functions, ambiguities and station coordinates are estimated simultaneously. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method.
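As a rough illustration of the idea behind the abstract above (not the authors' implementation): a smoothly varying systematic-error function can be estimated by penalised least squares, where a roughness penalty favours smooth solutions. The sketch below uses a second-difference penalty, a discrete analogue of the natural cubic-spline penalty; all data and the smoothing weight are hypothetical.

```python
import numpy as np

# Penalised least squares sketch: recover a smooth systematic-error
# signal from noisy residuals. Data, noise level and the smoothing
# weight lam are illustrative assumptions.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
truth = 0.05 * np.sin(2 * np.pi * t)            # smooth systematic error (m)
obs = truth + rng.normal(0.0, 0.01, t.size)     # noisy "residuals"

n = t.size
D = np.diff(np.eye(n), n=2, axis=0)             # second-difference operator
lam = 50.0                                      # smoothing weight

# Minimise ||obs - f||^2 + lam * ||D f||^2  ->  (I + lam D'D) f = obs
f_hat = np.linalg.solve(np.eye(n) + lam * D.T @ D, obs)

rmse_raw = np.sqrt(np.mean((obs - truth) ** 2))
rmse_fit = np.sqrt(np.mean((f_hat - truth) ** 2))
```

In the full method, this penalty term is carried inside the GPS adjustment, so the smooth error functions, ambiguities and coordinates are estimated in one system rather than in a separate smoothing pass.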
Abstract:
BACKGROUND: Intraocular gas bubbles expand as patients move up to higher altitude. This may cause an acute intraocular pressure (IOP) rise with associated vascular obstructions and visual loss. MATERIALS AND METHODS: Two pseudophakic patients underwent a pars plana vitrectomy and 23% SF6 gas tamponade for a pseudophakic retinal detachment. During the immediate post-operative phase, the patients travelled daily up to their domicile, which was situated approximately 600 m higher than the level where they had been operated on. These travels were always without any pain or visual loss. However, 1 week after surgery both patients developed severe ocular pain, and one patient had complete temporary loss of vision after ascending to altitude levels that had previously presented no problem. Both episodes occurred in parallel with a change in barometric pressure. RESULTS: Treatment with acetazolamide reduced the increased IOP to normal levels, and visual acuity recovered. CONCLUSIONS: Although the post-operative size of an intraocular gas bubble decreases progressively over time, problems with bubble expansion may still occur even at a late stage if meteorological factors that increase the bubble size change.
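A back-of-the-envelope sketch of the physical mechanism above (not from the paper): an intraocular gas bubble behaves approximately isothermally, so Boyle's law gives V2 = V1 * P1 / P2 when ambient pressure falls. The pressures and bubble volume below are illustrative assumptions, not measured values.

```python
# Boyle's law sketch: bubble expansion under a drop in ambient pressure.
# All numbers are hypothetical illustrations.
def bubble_volume(v1_ml, p1_hpa, p2_hpa):
    """Isothermal bubble volume after an ambient pressure change."""
    return v1_ml * p1_hpa / p2_hpa

# Ascending ~600 m lowers ambient pressure by roughly 70 hPa near sea level.
v_low = 0.5                                    # assumed bubble volume (ml)
v_high = bubble_volume(v_low, 1013.0, 943.0)   # volume after the ascent
expansion_pct = 100.0 * (v_high - v_low) / v_low
```

A passing low-pressure weather system can shift barometric pressure by tens of hPa on its own, which is why a previously harmless ascent can become symptomatic, as the case report describes.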
Abstract:
The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
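A minimal sketch of a "truncated" Gauss–Newton iteration as characterised above (our illustration, not the operational assimilation systems discussed): the inner linear least squares problem is solved only approximately, here by a few conjugate-gradient steps on the normal equations. The toy fitting problem and all names are hypothetical.

```python
import numpy as np

def truncated_gauss_newton(residual, jacobian, x0, outer=20, inner=5):
    """Gauss-Newton with an inexact ("truncated") inner CG solve."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        r, J = residual(x), jacobian(x)
        A, b = J.T @ J, -J.T @ r       # normal equations for min ||J dx + r||
        dx = np.zeros_like(x)
        res, p = b.copy(), b.copy()
        for _ in range(inner):         # truncated inner conjugate gradients
            rr = res @ res
            if rr < 1e-30:
                break
            Ap = A @ p
            alpha = rr / (p @ Ap)
            dx += alpha * p
            res = res - alpha * Ap
            p = res + (res @ res / rr) * p
        x = x + dx
    return x

# Toy nonlinear fit: y = exp(a*t) + c with true (a, c) = (0.7, 0.3).
t = np.linspace(0.0, 1.0, 30)
y = np.exp(0.7 * t) + 0.3
res_f = lambda x: np.exp(x[0] * t) + x[1] - y
jac_f = lambda x: np.column_stack([t * np.exp(x[0] * t), np.ones_like(t)])
a_hat, c_hat = truncated_gauss_newton(res_f, jac_f, [0.0, 0.0])
```

A "perturbed" variant, in this framing, would replace `jacobian` in the inner problem by a cheaper approximate operator; the abstract's convergence conditions bound how much truncation and perturbation the outer iteration tolerates.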
Abstract:
In the Eady model, where the meridional potential vorticity (PV) gradient is zero, perturbation energy growth can be partitioned cleanly into three mechanisms: (i) shear instability, (ii) resonance, and (iii) the Orr mechanism. Shear instability involves two-way interaction between Rossby edge waves on the ground and lid, resonance occurs as interior PV anomalies excite the edge waves, and the Orr mechanism involves only interior PV anomalies. These mechanisms have distinct implications for the structural and temporal linear evolution of perturbations. Here, a new framework is developed in which the same mechanisms can be distinguished for growth on basic states with nonzero interior PV gradients. It is further shown that the evolution from quite general initial conditions can be accurately described (peak error in perturbation total energy typically less than 10%) by a reduced system that involves only three Rossby wave components. Two of these are counterpropagating Rossby waves—that is, generalizations of the Rossby edge waves when the interior PV gradient is nonzero—whereas the other component depends on the structure of the initial condition and its PV is advected passively with the shear flow. In the cases considered, the three-component model outperforms approximate solutions based on truncating a modal or singular vector basis.
Abstract:
Several previous studies have attempted to assess the sublimation depth-scales of ice particles from clouds into clear air. Upon examining the sublimation depth-scales in the Met Office Unified Model (MetUM), it was found that the MetUM has evaporation depth-scales 2–3 times larger than radar observations. Similar results can be seen in the European Centre for Medium-Range Weather Forecasts (ECMWF), Regional Atmospheric Climate Model (RACMO) and Météo-France models. In this study, we use radar simulation (converting model variables into radar observations) and one-dimensional explicit microphysics numerical modelling to test and diagnose the cause of the deep sublimation depth-scales in the forecast model. The MetUM data and parametrization scheme are used to predict terminal velocity, which can be compared with the observed Doppler velocity. This can then be used to test hypotheses as to why the sublimation depth-scale is too large within the MetUM: turbulence could lead to dry-air entrainment and higher evaporation rates; the particle density may be wrong; the particle capacitance may be too high, leading to incorrect evaporation rates; or the humidity within the sublimating layer may be incorrectly represented. We show that the most likely cause of deep sublimation zones is an incorrect representation of model humidity in the layer. This is tested further by using a one-dimensional explicit microphysics model, which tests the sensitivity of ice sublimation to key atmospheric variables and is capable of including sonde and radar measurements to simulate real cases. Results suggest that the MetUM grid resolution at ice cloud altitudes is not sufficient to maintain the sharp drop in humidity that is observed in the sublimation zone.
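A one-dimensional toy sketch of the sensitivity being tested above (hypothetical constants, far simpler than the paper's explicit microphysics model): a falling ice particle loses mass at a rate proportional to its capacitance and the ice subsaturation, and the depth at which it vanishes is a crude sublimation depth-scale. The point it illustrates is the humidity sensitivity: a moister layer gives a deeper sublimation zone.

```python
# Toy sublimation depth-scale. The rate constant k, fall speed, initial
# mass and the capacitance ~ m^(1/3) scaling are all illustrative
# assumptions, not the MetUM parametrization.
def sublimation_depth(rh_ice, m0=1e-8, v_fall=1.0, k=5e-9, dt=1.0):
    """Depth (m) a falling particle survives; rh_ice is RH w.r.t. ice (0-1)."""
    m, depth = m0, 0.0
    while m > 0.0 and depth < 5000.0:
        cap = m ** (1.0 / 3.0)               # capacitance ~ size ~ m^(1/3)
        m += k * cap * (rh_ice - 1.0) * dt   # mass loss when subsaturated
        depth += v_fall * dt
    return depth

d_dry = sublimation_depth(0.60)    # drier layer: shallower sublimation zone
d_moist = sublimation_depth(0.90)  # moister layer: deeper sublimation zone
```

In this caricature, a model humidity that is too moist in the sublimating layer (as the study diagnoses for the MetUM) directly produces an overly deep sublimation zone.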
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Because numerical weather prediction models must solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been intensively used for many years. Weather services have thus based their analysis not only on synoptic data at the time of the analysis and climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have a fairly good coverage of surface observations 8 times a day, and several upper air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is applied most often, we may without any difficulty treat all observations as synoptic.
No observation would thus be more than 90 minutes off the analysis time, and the observations even during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
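The closing claim can be checked with simple arithmetic. The wind speed below is our assumed figure for "strong transient motion" (roughly jet-stream strength), not a value from the text.

```python
# Arithmetic check of the 3-hour-cycle claim. wind_ms is an assumption.
cycle_hours = 3.0
max_offset_min = cycle_hours * 60.0 / 2.0   # observations at most half a cycle off
wind_ms = 90.0                              # assumed strong jet-level wind (m/s)
drift_km = wind_ms * max_offset_min * 60.0 / 1000.0
```

Even at 90 m/s, air displaced over 90 minutes travels just under 500 km, consistent with treating all observations within a 500 km mesh as synoptic.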
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer, (2) coupling to fine-scale computational fluid dynamic Reynolds-averaged Navier–Stokes and Large-Eddy simulation models for transport and dispersion (T&D) applications, (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT), and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios. Copyright © 2010 Royal Meteorological Society
Abstract:
The secular record of annual mean temperatures at Bremen shows that inhomogeneities, especially those caused by station transfers, lead to serious problems in interpreting climatic trends or fluctuations. Two transfers of the meteorological observing station in Bremen within the last century, in 1935/36 and in 1978, caused significant inhomogeneities that are well documented by several years of parallel measurements. The stagnation of the temperature level in the original data set is evidently a result of these transfers. The homogenized record reveals a significant warming trend of about 1 Kelvin over the last century.
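A minimal sketch of the homogenisation step described above, assuming the common overlap-adjustment approach (synthetic numbers, not the Bremen data): an overlap of parallel measurements at the old and new sites estimates the offset introduced by the transfer, and the earlier segment is shifted onto the new site's level.

```python
# Homogenise a temperature series across a station transfer using
# parallel measurements. All series here are hypothetical annual means.
def homogenise(old_series, new_series, overlap_old, overlap_new):
    """Shift the old-site segment onto the new site's level."""
    offset = (sum(overlap_new) / len(overlap_new)
              - sum(overlap_old) / len(overlap_old))
    return [t + offset for t in old_series] + list(new_series)

# New site reads ~0.4 K warmer during the parallel-measurement overlap.
overlap_old = [8.9, 9.1, 9.0]
overlap_new = [9.3, 9.5, 9.4]
old_segment = [8.7, 8.8, 8.9, 9.1]
new_segment = [9.6, 9.7]
merged = homogenise(old_segment, new_segment, overlap_old, overlap_new)
```

Without such an adjustment, a transfer to a cooler site can mask part of a real warming trend, which is the "stagnation" effect the abstract attributes to the original Bremen data.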
Abstract:
Climate change is one of the biggest environmental problems of the 21st century. The most sensitive indicators of the effects of climatic change are phenological processes of the biota. The earliest observed effects of climate change are the remarkable changes in the phenology (i.e. the timing of the phenophases) of plants and animals, which have since been systematically monitored. In our research we asked which meteorological factors show the strongest statistical relationships with phenological phenomena, based on selected plant and insect species for which large phenological databases are available. Our study was based on two large databases: one is the Lepidoptera database of the Hungarian Plant Protection and Forestry Light Trap Network, the other is the Geophytes Phenology Database of the Botanical Garden of Eötvös Loránd University. For butterflies, statistically defined phenological dates were determined from the daily collection data, while for plants, observation data on blooming were available. The same meteorological indicators were applied to both groups. On the basis of the data series, correlation analyses were carried out and a new indicator, the so-called G index, was introduced, summing up the number of correlations found to be significant at the different significance levels. In the present study we compare the significant meteorological factors and analyse the differences based on the correlation data for plants and butterflies. The data for butterflies are much more varied with regard to which meteorological factors are effective.
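A hedged sketch of our reading of the G index described above (the data, the critical-value thresholds and the factor names are all hypothetical, and the true critical values depend on sample size): for each meteorological factor, count how many significance levels its correlation with a phenological series passes, and sum the counts.

```python
import math

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def g_index(pairs, critical_r=(0.50, 0.65, 0.80)):
    """Sum, over factor/phenology pairs, of the |r| thresholds exceeded.

    critical_r stands in for critical values at e.g. the 0.05, 0.01 and
    0.001 levels; the values here are illustrative placeholders."""
    g = 0
    for x, y in pairs:
        r = abs(pearson_r(x, y))
        g += sum(1 for c in critical_r if r > c)
    return g

# Two hypothetical factor-vs-phenology series.
spring_temp = [7.1, 8.0, 6.5, 9.2, 8.8, 7.7]
flowering_day = [102, 96, 108, 90, 92, 99]    # earlier when warmer
rainfall = [40, 55, 38, 61, 47, 52]
flight_peak = [180, 178, 185, 176, 181, 183]
g = g_index([(spring_temp, flowering_day), (rainfall, flight_peak)])
```

A factor that is significant only at the weakest level contributes 1, while one significant at every level contributes 3, so the index weights strongly supported relationships more heavily.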
Abstract:
Substantial complexity has been introduced into treatment regimens for patients with human immunodeficiency virus (HIV) infection. Many drug-related problems (DRPs) are detected in these patients, such as low adherence, therapeutic inefficacy, and safety issues. We evaluated the impact of pharmacist interventions on CD4+ T-lymphocyte count, HIV viral load, and DRPs in patients with HIV infection. In this 18-month prospective controlled study, 90 outpatients were selected by convenience sampling from the Hospital Dia-University of Campinas Teaching Hospital (Brazil). Forty-five patients comprised the pharmacist intervention group and 45 the control group; all patients had HIV infection with or without acquired immunodeficiency syndrome. Pharmaceutical appointments were conducted based on the Pharmacotherapy Workup method, although DRPs and pharmacist intervention classifications were modified for applicability to institutional service limitations and research requirements. Pharmacist interventions were performed immediately after detection of DRPs. The main outcome measures were DRPs, CD4+ T-lymphocyte count, and HIV viral load. After pharmacist intervention, DRPs decreased from 5.2 (95% confidence interval [CI] =4.1-6.2) to 4.2 (95% CI =3.3-5.1) per patient (P=0.043). A total of 122 pharmacist interventions were proposed, with an average of 2.7 interventions per patient. All the pharmacist interventions were accepted by physicians, and among patients, the interventions were well accepted during the appointments, but compliance with the interventions was not measured. A statistically significant increase in CD4+ T-lymphocyte count in the intervention group was found (260.7 cells/mm(3) [95% CI =175.8-345.6] to 312.0 cells/mm(3) [95% CI =23.5-40.6], P=0.015), which was not observed in the control group. There was no statistical difference between the groups regarding HIV viral load. 
This study suggests that pharmacist interventions in patients with HIV infection can increase CD4+ T-lymphocyte counts and decrease DRPs, demonstrating the importance of an optimal pharmaceutical care plan.
Abstract:
In this study, transmission-line modeling (TLM) applied to bio-thermal problems was improved by incorporating several novel computational techniques. Graded meshes made the computation 9 times faster and used only a fraction (16%) of the computational resources required by regular meshes when analyzing heat flow through heterogeneous media; unlike regular meshes, they also allow heat sources to be modeled in all segments of the mesh. A new boundary condition that accounts for thermal properties, resulting in more realistic modeling of complex problems, is introduced, along with a new way of calculating an error parameter. The temperatures calculated between nodes were compared against results from the literature and agreed to within 1%. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential in heat transfer analysis of biological systems.
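For readers unfamiliar with numerical bio-heat modeling, the sketch below shows the kind of problem and error check involved, using a much simpler explicit 1-D finite-difference scheme as a stand-in for TLM (this is not the paper's formulation, and all parameters are hypothetical): heat conduction through a slab with fixed boundary temperatures, compared against the exact linear steady-state profile.

```python
# Explicit 1-D heat conduction to steady state. Node count, boundary
# temperatures and the diffusion number r are illustrative assumptions.
def steady_temperatures(n=21, t_left=37.0, t_right=25.0, steps=20000, r=0.25):
    """March an explicit diffusion scheme toward steady state.

    r = alpha*dt/dx**2 must stay <= 0.5 for stability."""
    T = [t_left] * (n - 1) + [t_right]
    for _ in range(steps):
        T = ([t_left]
             + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
                for i in range(1, n - 1)]
             + [t_right])
    return T

T = steady_temperatures()
# Steady conduction between fixed temperatures is linear in position;
# report the largest relative error against that exact profile.
exact = [37.0 + (25.0 - 37.0) * i / 20 for i in range(21)]
max_rel_err = max(abs(a - b) / abs(b) for a, b in zip(T, exact))
```

TLM replaces this differential-equation stencil with a network of transmission lines whose pulse scattering mimics diffusion, which is what makes graded (non-uniform) meshes and material-aware boundary conditions natural extensions.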
Abstract:
The scope of this study is to identify the prevalence of access to information about how to prevent oral problems among schoolchildren in the public school network, as well as the factors associated with such access. This is a cross-sectional and analytical study conducted among 12-year-old schoolchildren in a Brazilian municipality with a large population. The examinations were performed by 24 trained and calibrated dentists, with the aid of 24 recorders. Data collection occurred in 36 public schools selected from the 89 public schools of the city. Descriptive, univariate and multiple analyses were conducted. Of the 2510 schoolchildren included in the study, 2211 reported having received information about how to prevent oral problems. Access to such information was greater among those who used private dental services, and lower among those who used the service for treatment, who rated the service as regular or bad/awful, who used only a toothbrush, or a toothbrush and tongue scrubbing, as their means of oral hygiene, and who reported not being satisfied with the appearance of their teeth. The conclusion drawn is that the majority of schoolchildren had access to information about how to prevent oral problems, though access was associated with the characteristics of health services, health behavior and outcomes.
Abstract:
Efforts made by the scientific community in recent years towards the development of numerous green chemical processes and wastewater treatment technologies are presented and discussed. In the light of these approaches, environmentally friendly technologies, as well as the key role played by the well-known advanced oxidation processes, are discussed, with special attention given to those comprising ozone applications. Fundamental and applied aspects of ozone technology and its application are also presented.
Abstract:
PURPOSE: To compare parents' reports of youth problems (PRYP) with adolescent problems self-reports (APSR) pre/post behavioral treatment of nocturnal enuresis (NE) based on the use of a urine alarm. MATERIALS AND METHODS: Adolescents (N = 19) with mono-symptomatic (primary or secondary) nocturnal enuresis underwent group treatment for 40 weeks. The discharge criterion was established as 8 consecutive weeks of dry nights. PRYP and APSR were scored by the Child Behavior Checklist (CBCL) and Youth Self-Report (YSR). RESULTS: Pre-treatment data: 1) Higher number of clinical cases based on parent report than on self-report for Internalizing Problems (IP) (13/19 vs. 4/19), Externalizing Problems (EP) (7/19 vs. 5/19) and Total Problems (TP) (11/19 vs. 5/19); 2) Mean PRYP scores for IP (60.8) and TP (61) were within the deviant range (T score ≥ 60), while mean PRYP scores for EP (57.4) and mean APSR scores (IP = 52.4, EP = 49.5, TP = 52.4) were within the normal range. The difference between PRYP and APSR scores was significant. Post-treatment data: 1) Discharge for the majority of the participants (16/19); 2) Reduction in the number of clinical cases on parental evaluation: 9/19 adolescents remained within the clinical range for IP, 2/19 for EP, and 7/19 for TP; 3) All post-treatment mean scores were within the normal range; the difference between pre- and post-evaluation scores was significant for PRYP. CONCLUSIONS: The behavioral treatment based on the use of a urine alarm is effective for adolescents with mono-symptomatic (primary and secondary) nocturnal enuresis. The study favors the hypothesis that enuresis is a cause, not a consequence, of other behavioral problems.