912 results for Numerical Algorithms and Problems


Relevance:

100.00%

Publisher:

Abstract:

The Persian Gulf (PG) is a semi-enclosed shallow sea connected to the open ocean through the Strait of Hormuz. The thermocline, a sharp decrease of temperature in the subsurface layer of the water column that leads to stratification, occurs seasonally in the PG. The forcing comprises tides, river inflow, solar radiation, evaporation, the northwesterly wind and water exchange with the Oman Sea, all of which influence this process. In this research, an analysis of field data and a numerical study (with the Princeton Ocean Model, POM) of the summer thermocline development in the PG are presented. The salinity and temperature observations of the 1992 Mt. Mitchell cruise show that the thermocline is effectively removed in winter by strong wind mixing and lower solar radiation, but is gradually formed and developed during spring and summer; in fact, the increased vertical convection through the water column in winter reduces the vertical temperature gradient and effectively removes the thermocline. The development of the thermocline, which evolves from east to west, is studied using numerical simulation and existing observations. Results show that as the winter northwesterly wind weakens during the transition to summer, the fresher inflow from the Oman Sea and the solar radiation increase; these factors cause the thermocline to form and develop from winter to summer, even over the northwestern part of the PG. The model results show that for the more realistic monthly averaged wind experiments the thermocline develops as indicated by the summer observations. The formation of the thermocline also appears to decrease the dissolved oxygen in the water column, owing to the lack of mixing caused by the induced stratification. Over most of the PG the temperature difference between surface and subsurface increases exponentially from March until May. Similar variations are predicted for the salinity differences, although with smaller values than observed. Indeed, thermocline development happens most rapidly in the Persian Gulf from spring to summer. The vertical temperature difference between surface and bottom reaches 9 °C in some parts of the study area in summer. Correlation coefficients between the model results and the measurements are 0.85 for temperature and 0.8 for salinity. The rate of thermocline development was found to be between 0.1 and 0.2 m per day in the Persian Gulf during the six months from winter to early summer. The model also shows that turbulent kinetic energy increases in the northwestern part of the PG from winter to early summer, which could be due to increased internal-wave activity and the intensified stability of the water column during this period.
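
The model-observation agreement quoted above rests on a plain Pearson correlation. Below is a minimal sketch of that computation, assuming hypothetical arrays of co-located modeled and observed temperatures; the actual cruise data are not given in the abstract.

    import numpy as np

    def pearson_r(model, obs):
        """Pearson correlation coefficient between model output and observations."""
        m = np.asarray(model, dtype=float)
        o = np.asarray(obs, dtype=float)
        m_anom = m - m.mean()          # anomalies about the mean
        o_anom = o - o.mean()
        return (m_anom @ o_anom) / np.sqrt((m_anom @ m_anom) * (o_anom @ o_anom))

    # Hypothetical co-located values (degrees C); illustrative only.
    modeled_T  = [24.1, 25.3, 27.8, 30.2, 31.0, 32.4]
    observed_T = [23.8, 25.9, 27.1, 29.8, 31.5, 32.0]
    print(f"r = {pearson_r(modeled_T, observed_T):.2f}")  # the paper reports r = 0.85 for temperature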

Relevance:

100.00%

Publisher:

Abstract:

We consider the a posteriori error analysis and hp-adaptation strategies for hp-version interior penalty discontinuous Galerkin methods for second-order partial differential equations with nonnegative characteristic form on anisotropically refined computational meshes with anisotropically enriched elemental polynomial degrees. In particular, we exploit duality-based hp-error estimates for linear target functionals of the solution and design and implement the corresponding adaptive algorithms to ensure reliable and efficient control of the error in the prescribed functional to within a given tolerance. This involves exploiting both local isotropic and anisotropic mesh refinement and isotropic and anisotropic polynomial degree enrichment. The superiority of the proposed algorithm in comparison with standard hp-isotropic mesh refinement algorithms and an h-anisotropic/p-isotropic adaptive procedure is illustrated by a series of numerical experiments.
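
Such goal-oriented adaptive algorithms follow a solve-estimate-mark-refine cycle. The sketch below is not the paper's hp-DG method; it only mimics that control loop in one dimension, using the piecewise-linear interpolation error of a known function as a stand-in for the duality-based indicator.

    import numpy as np

    def f(x):
        return np.tanh(20 * (x - 0.5))   # steep layer that demands local refinement

    def local_error(nodes):
        """Midpoint interpolation error per element: a stand-in error indicator."""
        mid = 0.5 * (nodes[:-1] + nodes[1:])
        interp = 0.5 * (f(nodes[:-1]) + f(nodes[1:]))
        return np.abs(f(mid) - interp)

    nodes = np.linspace(0.0, 1.0, 5)     # coarse initial mesh
    tol = 1e-3
    while True:
        eta = local_error(nodes)         # "estimate"
        if eta.sum() < tol:              # stop once the error measure is under tolerance
            break
        marked = eta > 0.5 * eta.max()   # "mark" the worst elements (fixed-fraction strategy)
        mids = 0.5 * (nodes[:-1] + nodes[1:])[marked]
        nodes = np.sort(np.concatenate([nodes, mids]))  # "refine" by bisection

    print(f"{len(nodes) - 1} elements after adaptation")  # refinement clusters near x = 0.5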

Relevance:

100.00%

Publisher:

Abstract:

In this study, a numerical simulation of the Caspian Sea circulation was performed using the COHERENS three-dimensional numerical model and field data. The COHERENS model and FVCOM were both run under wind forcing, and the simulation results were then compared. The simulation covers the whole Caspian Sea, with a horizontal grid size of approximately 5 km and 30 sigma levels. The numerical results indicate that wind forcing and the temperature gradient are the most important drivers of the Caspian circulation pattern. One effect of the wind-driven currents was the upwelling that formed along the eastern shores of the Caspian Sea in summer. The simulation results also indicate that this phenomenon occurred at depths of less than 40 m, and that the vertical velocity in July and August was 10 m and 7 m, respectively. During the upwelling period, temperatures on the east coast were about 5 °C lower than on the west coast. In autumn and winter, warm waters moved from the southeast coast to the north and cold waters moved from the west coast of the central Caspian toward the south. In the subsurface and deep layers, these movements were much more structured and strengthened the anticlockwise circulation, especially in the central Caspian. The results of the two models, COHERENS and FVCOM, run under wind forcing agree closely, and the wind-driven circulation pattern of the two models is almost identical.
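
Coastal upwelling of the kind described here is conventionally estimated from the wind stress via Ekman transport. The sketch below computes the upwelling velocity implied by an alongshore wind stress using the textbook relation w = tau / (rho f L); it is not taken from the COHERENS or FVCOM runs, and all parameter values are illustrative.

    import numpy as np

    # Illustrative parameters (not from the paper).
    tau = 0.1        # alongshore wind stress, N/m^2
    rho = 1025.0     # seawater density, kg/m^3
    lat = 42.0       # latitude on the central Caspian east coast, degrees
    L   = 20e3       # cross-shore width over which the Ekman divergence acts, m

    f = 2 * 7.2921e-5 * np.sin(np.radians(lat))   # Coriolis parameter, 1/s
    M_ekman = tau / (rho * f)                     # offshore Ekman transport, m^2/s
    w = M_ekman / L                               # upwelling vertical velocity, m/s

    print(f"w = {w:.2e} m/s  (~{w * 86400:.1f} m/day)")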

Relevance:

100.00%

Publisher:

Abstract:

We propose a pre-processing mesh re-distribution algorithm based upon harmonic maps, employed in conjunction with discontinuous Galerkin approximations of advection-diffusion-reaction problems. Extensive two-dimensional numerical experiments with different choices of monitor functions, including monitor functions derived from goal-oriented a posteriori error indicators, are presented. The examples clearly demonstrate the capabilities and the benefits of combining our pre-processing mesh movement algorithm with both uniform and adaptive isotropic and anisotropic mesh refinement.
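
In one dimension, mesh redistribution under a monitor function reduces to equidistribution: nodes are moved so that every cell carries the same integral of the monitor. The sketch below is a generic equidistribution step, not the paper's harmonic-map algorithm (which is intrinsically multi-dimensional); the arclength-type monitor is a common illustrative choice.

    import numpy as np

    def redistribute(x, monitor, n_iter=20):
        """Move interior nodes so the monitor function is equidistributed over cells."""
        for _ in range(n_iter):
            m = monitor(0.5 * (x[:-1] + x[1:]))       # monitor at cell midpoints
            cell_mass = m * np.diff(x)                # integral of monitor per cell
            cum = np.concatenate([[0.0], np.cumsum(cell_mass)])
            targets = np.linspace(0.0, cum[-1], len(x))
            x = np.interp(targets, cum, x)            # invert cumulative mass -> new nodes
        return x

    u_prime = lambda x: 30 / np.cosh(30 * (x - 0.4))**2   # derivative of a layered profile
    monitor = lambda x: np.sqrt(1.0 + u_prime(x)**2)      # arclength monitor function

    x0 = np.linspace(0.0, 1.0, 21)
    x1 = redistribute(x0, monitor)
    print(np.round(x1, 3))   # nodes cluster near the layer at x = 0.4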

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.

Relevance:

100.00%

Publisher:

Abstract:

There is abundant evidence that young children are capable of developing mathematical knowledge and that these children's arithmetic skills are predictors of their future academic performance. There is also common agreement that the quality of early mathematics education has an important influence on children's later learning. In Ecuador there are few studies on children's early mathematical competencies and on how they are taught. A study was therefore initiated to (1) assess the numerical competencies of pre-school and kindergarten (first grade of basic education) children attending a public school in Cuenca, with the aim of critically analysing their numerical thinking and reasoning; and (2) examine teachers' practices and beliefs regarding the teaching of mathematics and children's mathematical competencies. Administration of the Test de Conocimiento Numérico (Griffin, 2005) showed that most of the participating children had not developed basic numerical skills. In addition, the teachers expressed a strong belief that young children are not capable of mathematical thinking. As a consequence, the mathematical activities carried out by children and teachers are insufficiently developed. The scientific and practical implications of these results are discussed.

Relevance:

100.00%

Publisher:

Abstract:

Phylogenetic inference consists in the search for an evolutionary tree that explains, as well as possible, the genealogical relationships of a set of species. Phylogenetic analysis has a large number of applications in areas such as biology, ecology and paleontology. Several criteria have been defined to infer phylogenies, among them maximum parsimony and maximum likelihood. The first tries to find the phylogenetic tree that minimizes the number of evolutionary steps needed to describe the evolutionary history of the species, while the second tries to find the tree that has the highest probability of producing the observed data according to an evolutionary model. The search for a phylogenetic tree can be formulated as a multi-objective optimization problem, which aims to find trees that satisfy both the parsimony and the likelihood criteria simultaneously (and as far as possible). Because these criteria differ, there will not be a single optimal solution (a single tree) but a set of compromise solutions, called the "Pareto-optimal" set. To find these solutions, evolutionary algorithms are nowadays used with success. These algorithms are a family of approximate techniques inspired by the process of natural selection, and they usually find high-quality solutions to difficult optimization problems. They work by manipulating a set of trial solutions (trees, in the case of phylogeny) with operators, some of which exchange information between solutions, simulating DNA crossover, while others apply random modifications, simulating mutation. The result is an approximation to the Pareto-optimal set, which can be shown in a graph so that the domain expert (the biologist, in the case of inference) can choose the compromise solution of greatest interest. For multi-objective optimization applied to phylogenetic inference there is an open-source software tool, called MO-Phylogenetics, designed to solve inference problems with both classic and state-of-the-art evolutionary algorithms. REFERENCES [1] C.A. Coello Coello, G.B. Lamont, D.A. van Veldhuizen. Evolutionary Algorithms for Solving Multi-Objective Problems. Springer, August 2007. [2] C. Zambrano-Vega, A.J. Nebro, J.F. Aldana-Montes. MO-Phylogenetics: a phylogenetic inference software tool with multi-objective evolutionary metaheuristics. Methods in Ecology and Evolution. In press, February 2016.
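
The notion of Pareto optimality used above is easy to make concrete. The sketch below filters a set of candidate trees, each scored by a hypothetical (parsimony, negative log-likelihood) pair where lower is better on both axes, down to its non-dominated subset; the scores are invented for illustration.

    def pareto_front(solutions):
        """Return the non-dominated subset: keep s unless some other solution is
        <= on both objectives and strictly < on at least one (both minimized)."""
        front = []
        for s in solutions:
            dominated = any(
                o[0] <= s[0] and o[1] <= s[1] and (o[0] < s[0] or o[1] < s[1])
                for o in solutions
            )
            if not dominated:
                front.append(s)
        return front

    # Hypothetical (parsimony score, -log likelihood) pairs for candidate trees.
    candidates = [(120, 5400.0), (118, 5460.0), (125, 5380.0), (118, 5410.0), (122, 5390.0)]
    print(pareto_front(candidates))
    # every candidate survives except (118, 5460.0), which (118, 5410.0) dominates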

Relevance:

100.00%

Publisher:

Abstract:

The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. Therefore, all the elements and components of the energy conversion system are modeled numerically and combined to obtain a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are based on circuit models that account only for the parasitic inner parameters of the power devices and the connections between the components. This dissertation aims to obtain appropriate physics-based models for power conversion systems which not only represent the steady-state behavior of the components but also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition, and enables the accurate design of components such as effective EMI filters, switching algorithms and circuit topologies [7]. One application of the developed modeling technique is the design of new sets of topologies for high-frequency, high-efficiency converters for variable-speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications with the ability to overcome the blocking-voltage limitations of available power semiconductor devices. Another advantage is the selection of the best matching topology with an inherent reduction of switching losses, which can be exploited to improve the overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints. This includes physical characteristics such as reduced package size and weight, optimized interactions with neighboring components and higher power density. In addition, the electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions and the design of attenuation measures and enclosures.

Relevance:

100.00%

Publisher:

Abstract:

Many years have passed since Berners-Lee envisioned the Web as it should be (1999), but many information professionals still do not know their precise role in its development, especially concerning ontologies, considered one of its main elements. Why? Might it still be a lack of understanding between the different academic communities involved (namely, Computer Science, Linguistics, and Library and Information Science), as reported by Soergel (1999)? The idea behind the Semantic Web is that of several technologies working together to achieve optimum information retrieval performance, which is based on proper resource description in a machine-understandable way, by means of metadata and vocabularies (Greenberg, Sutton and Campbell, 2003). This is obviously something that Library and Information Science professionals can do very well, but are we doing enough? When computer scientists put the ontology paradigm on stage, they were asking for semantically richer vocabularies that could support logical inferences in artificial intelligence as a way to improve information retrieval systems. Which direction should vocabulary development take to contribute better to that common goal? The main objective of this paper is twofold: 1) to identify the main trends, issues and problems concerning ontology research, and 2) to identify possible contributions from the Library and Information Science area to the development of ontologies for the Semantic Web. To do so, the paper is structured as follows. First, the methodology followed in the paper is reported, which is based on a thorough literature review in which the main contributions are analysed. The paper then discusses the main trends, issues and problems concerning ontology research identified in the literature review. Finally, recommendations of possible contributions from the Library and Information Science area to the development of ontologies for the Semantic Web are presented.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, radars have been used in many applications, such as precision agriculture and advanced driver assistance systems. Optimal techniques for estimating the number of targets and their coordinates require solving multidimensional optimization problems that entail huge computational effort. This has motivated the development of sub-optimal estimation techniques able to achieve good accuracy at a manageable computational cost. Another technical issue in advanced driver assistance systems is the tracking of multiple targets. Although various filtering techniques have been developed, new efficient and robust algorithms for target tracking can be devised by exploiting a probabilistic approach based on factor graphs and the sum-product algorithm. The two contributions of this dissertation are the investigation of the filtering and smoothing problems from a factor graph perspective and the development of efficient algorithms for two- and three-dimensional radar imaging. Concerning the first contribution, a new factor graph for filtering is derived and the sum-product rule is applied to this graphical model; this makes it possible to interpret known algorithms and to develop new filtering techniques. Then, a general method, based on graphical modelling, is proposed to derive filtering algorithms that involve a network of interconnected Bayesian filters. Finally, the proposed graphical approach is exploited to devise a new smoothing algorithm. Numerical results for dynamic systems show that our algorithms can achieve a better complexity-accuracy tradeoff and better tracking capability than other techniques in the literature. Regarding radar imaging, various algorithms are developed for frequency-modulated continuous-wave radars; these algorithms rely on novel and efficient methods for the detection and estimation of multiple superimposed tones in noise. The accuracy achieved in the presence of multiple closely spaced targets is assessed on the basis of both synthetically generated data and measurements acquired through two commercial multiple-input multiple-output radars.
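
As an illustration of the factor-graph view of filtering mentioned above, the forward sum-product recursion on the graph of a linear-Gaussian state-space model reduces to the standard Kalman filter: Gaussian messages are passed along the chain, each summarized by a mean and a variance. The scalar sketch below is generic, with made-up model parameters, and is not the dissertation's network-of-filters architecture.

    import numpy as np

    # Scalar linear-Gaussian state-space model (illustrative parameters).
    a, q = 0.95, 0.1      # state transition x_k = a x_{k-1} + noise, Var = q
    h, r = 1.0, 0.5       # measurement  y_k = h x_k + noise,  Var = r

    rng = np.random.default_rng(0)
    x, xs, ys = 0.0, [], []
    for _ in range(50):                       # simulate a trajectory and noisy measurements
        x = a * x + rng.normal(0, np.sqrt(q))
        xs.append(x)
        ys.append(h * x + rng.normal(0, np.sqrt(r)))

    # Forward sum-product pass: each Gaussian message is (mean m, variance P).
    m, P = 0.0, 1.0
    for y in ys:
        m, P = a * m, a * a * P + q           # message through the transition factor (predict)
        k = P * h / (h * h * P + r)           # message from the measurement factor (update)
        m, P = m + k * (y - h * m), (1 - k * h) * P

    print(f"final estimate {m:.3f}, true state {xs[-1]:.3f}, posterior var {P:.3f}")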

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, we investigate the role of applied physics in epidemiological surveillance through the application of mathematical models, network science and machine learning. The spread of a communicable disease depends on many biological, social, and health factors. The large masses of data available make it possible, on the one hand, to monitor the evolution and spread of pathogenic organisms; on the other hand, to study the behavior of people, their opinions and habits. Presented here are three lines of research in which an attempt was made to solve real epidemiological problems through data analysis and the use of statistical and mathematical models. In Chapter 1, we applied language-inspired Deep Learning models to transform influenza protein sequences into vectors encoding their information content. We then attempted to reconstruct the antigenic properties of different viral strains using regression models and to identify the mutations responsible for vaccine escape. In Chapter 2, we constructed a compartmental model to describe the spread of a bacterium within a hospital ward. The model was informed and validated on time series of clinical measurements, and a sensitivity analysis was used to assess the impact of different control measures. Finally (Chapter 3) we reconstructed the network of retweets among COVID-19 themed Twitter users in the early months of the SARS-CoV-2 pandemic. By means of community detection algorithms and centrality measures, we characterized users’ attention shifts in the network, showing that scientific communities, initially the most retweeted, lost influence over time to national political communities. In the Conclusion, we highlighted the importance of the work done in light of the main contemporary challenges for epidemiological surveillance. In particular, we present reflections on the importance of nowcasting and forecasting, the relationship between data and scientific research, and the need to unite the different scales of epidemiological surveillance.
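
Compartmental models of the kind used in Chapter 2 partition a population into states and evolve them with coupled ODEs. The sketch below integrates the classic SIR model with a simple Euler step; the parameters are illustrative and unrelated to the hospital-ward model actually fitted in the thesis.

    import numpy as np

    def simulate_sir(beta, gamma, s0, i0, days, dt=0.1):
        """Euler integration of dS = -beta*S*I, dI = beta*S*I - gamma*I, dR = gamma*I."""
        steps = int(days / dt)
        s, i, r = s0, i0, 1.0 - s0 - i0        # fractions of the population
        history = []
        for k in range(steps):
            new_inf = beta * s * i * dt
            new_rec = gamma * i * dt
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            history.append((k * dt, s, i, r))
        return np.array(history)

    # Illustrative parameters: R0 = beta/gamma = 2.5, mean infectious period 5 days.
    traj = simulate_sir(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, days=120)
    peak_day = traj[traj[:, 2].argmax(), 0]
    print(f"epidemic peaks near day {peak_day:.0f} with {traj[:, 2].max():.1%} infectious")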

Relevance:

100.00%

Publisher:

Abstract:

This study aims to evaluate the frequency and severity of nausea and vomiting using two different instruments and relate them to quality of life (QOL) in patients with cancer receiving antineoplastic treatment. Severity of chemotherapy-induced nausea and vomiting (CINV) was measured by Common Terminology Criteria for Adverse Events (CTCAE) and a numerical scale. QOL was assessed using the Functional Assessment of Cancer Therapy-General questionnaire. Of the 50 patients studied, 60.0% reported nausea (40.0% CTCAE grade 1; 66.7% moderate intensity on numerical scale) and 30.0% reported vomiting (46.7% CTCAE grades 1 and 2, each; 66.7% moderate intensity on numerical scale). CINV did not influence overall QOL. The frequency of CINV was high. There was no association between nausea/vomiting and overall QOL.

Relevance:

100.00%

Publisher:

Abstract:

Below-cloud scavenging processes have been investigated by means of a numerical simulation, local atmospheric conditions and particulate matter (PM) concentrations at different sites in Germany. The below-cloud scavenging model was coupled with a bulk particulate matter counter (TSI Portacounter) dataset, consisting of the predicted variability of particulate air concentrations during selected rain events. The TSI samples and meteorological parameters were obtained during three winter campaigns: at Deuselbach, March 1994 (consisting of three different events); at Sylt, April 1994; and at Freiburg, March 1995. The results show good agreement between modeled and observed air concentrations, emphasizing the quality of the conceptual model used in the below-cloud scavenging numerical modeling. The modeled and observed data also show squared Pearson correlation coefficients above 0.7, statistically significant except for the Freiburg campaign event. The differences between the numerical simulations and the observed dataset are explained by changes in wind direction and, perhaps, by the absence of mass-advection terms in the model. These results validate previous works based on the same conceptual model.
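
Below-cloud scavenging is conventionally modeled as first-order washout, C(t) = C0 exp(-lambda t), where lambda is the scavenging coefficient. The sketch below applies this standard relation to a rain event; the coefficient and initial concentration are illustrative, not values from the campaigns.

    import numpy as np

    def washout(c0, scav_coeff, t):
        """First-order below-cloud washout: C(t) = C0 * exp(-lambda * t)."""
        return c0 * np.exp(-scav_coeff * t)

    c0 = 50.0          # initial PM concentration, ug/m^3 (illustrative)
    lam = 3e-4         # scavenging coefficient, 1/s (typical below-cloud magnitude)
    t = np.arange(0, 3600 + 1, 600)   # one-hour rain event, 10-minute steps

    for ti, ci in zip(t, washout(c0, lam, t)):
        print(f"t = {ti / 60:4.0f} min   C = {ci:5.1f} ug/m^3")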

Relevance:

100.00%

Publisher:

Abstract:

In recent years, we have experienced increasing interest in the understanding of the physical properties of collisionless plasmas, mostly because of the large number of astrophysical environments (e.g. the intracluster medium, ICM) containing magnetic fields that are strong enough to be coupled with the ionized gas and characterized by densities sufficiently low to prevent pressure isotropization with respect to the magnetic line direction. Under these conditions, a new class of kinetic instabilities arises, such as the firehose and mirror instabilities, which have been studied extensively in the literature. Their role in turbulence evolution and the cascade process in the presence of pressure anisotropy, however, is still unclear. In this work, we present the first statistical analysis of turbulence in collisionless plasmas using three-dimensional numerical simulations that solve the double-isothermal magnetohydrodynamic equations with the Chew-Goldberger-Low closure (CGL-MHD). We study models with different initial conditions to account for the firehose and mirror instabilities and to obtain different turbulent regimes. We find that CGL-MHD subsonic and supersonic turbulence shows only small differences from the MHD models in most cases. However, in the regimes of strong kinetic instabilities, the statistics, i.e. the probability distribution functions (PDFs) of density and velocity, are very different. In subsonic models, the instabilities cause an increase in the dispersion of density, while the dispersion of velocity is increased by a large factor in some cases. Moreover, the spectra of density and velocity show increased power at small scales, explained by the high growth rates of the instabilities. Finally, we calculated the structure functions of velocity and density fluctuations in the local reference frame defined by the direction of the magnetic lines. The results indicate that in some cases the instabilities significantly increase the anisotropy of the fluctuations. These results, even though preliminary and restricted to very specific conditions, show that the physical properties of turbulence in collisionless plasmas, such as those found in the ICM, may be very different from what has been largely believed.
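
For reference, the linear threshold conditions of the two instabilities named above are standard and can be stated in terms of the parallel and perpendicular pressures (Gaussian units); these are the textbook criteria, not expressions taken from the paper's simulations:

    \[
    \text{firehose:}\quad p_{\parallel} - p_{\perp} > \frac{B^{2}}{4\pi},
    \qquad
    \text{mirror:}\quad p_{\perp}\!\left(\frac{p_{\perp}}{p_{\parallel}} - 1\right) > \frac{B^{2}}{8\pi}.
    \]

In words, the firehose instability is triggered when the parallel pressure exceeds the perpendicular pressure by more than the magnetic tension can restore, while the mirror instability requires a sufficiently large perpendicular pressure excess.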

Relevance:

100.00%

Publisher:

Abstract:

Context. Fossil systems are defined as X-ray-bright galaxy groups (or clusters) with a two-magnitude difference between their two brightest galaxies within half the projected virial radius, and they represent an interesting extreme of the population of galaxy agglomerations. However, the physical conditions and processes leading to their formation are still poorly constrained. Aims. We compare the outskirts of fossil systems with those of normal groups to understand whether environmental conditions play a significant role in their formation. We study groups of galaxies in both numerical simulations and observations. Methods. We use a variety of statistical tools, including the spatial cross-correlation function and the local density parameter Delta_5, to probe differences in the density and structure of the environments of "normal" and "fossil" systems in the Millennium simulation. Results. We find that the number density of galaxies surrounding fossil systems evolves from greater than that observed around normal systems at z = 0.69 to lower than that of normal systems by z = 0. Both fossil and normal systems exhibit an increment in their otherwise radially declining local density measure (Delta_5) at distances of order 2.5 r_vir from the system centre. We show that this increment is more noticeable for fossil systems than for normal systems and demonstrate that this difference is linked to the earlier formation epoch of fossil groups. Despite the importance of the assembly time, we show that the environment differs between fossil and non-fossil systems of similar masses and formation times throughout their evolution. We also confirm that the physical characteristics identified in the Millennium simulation can be detected in SDSS observations. Conclusions. Our results confirm the commonly held belief that fossil systems assembled earlier than normal systems, but also show that the surroundings of fossil groups could be responsible for the formation of their large magnitude gap.
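
The local density parameter Delta_5 used in the Methods is based on the distance to the fifth nearest neighbour. Below is a minimal sketch of such an estimator in 3D, assuming hypothetical galaxy coordinates; normalization conventions vary between papers, so this follows the generic form rho_5 = 5 / ((4/3) pi d5^3) normalized by the sample mean, rather than the paper's exact definition.

    import numpy as np

    def delta5(positions):
        """Local density contrast from the distance d5 to the 5th nearest neighbour:
        rho_5 = 5 / ((4/3) pi d5^3), normalized by the sample's mean rho_5."""
        pos = np.asarray(positions, dtype=float)
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)  # pairwise distances
        d.sort(axis=1)
        d5 = d[:, 5]                                  # column 0 is the zero self-distance
        rho = 5.0 / ((4.0 / 3.0) * np.pi * d5**3)
        return rho / rho.mean()

    rng = np.random.default_rng(1)
    galaxies = rng.uniform(0, 50, size=(200, 3))      # hypothetical positions, Mpc
    print(np.round(delta5(galaxies)[:5], 2))          # values > 1 mean denser than average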