956 results for Computer models


Relevance:

60.00%

Publisher:

Abstract:

The Internet is fully embedded in contemporary society, especially for entertainment and commerce. Its reach has extended beyond traditional desktop computers to mobile devices such as cell phones and GPS receivers. The scientific community likewise takes advantage of it, both for publishing studies and for communication between clusters processing information, as at the LHC in Switzerland. In geodetic positioning, research in the area has introduced the concept of Virtual Reference Stations (VRS), which requires a communication channel between the real reference stations and a central system, as well as between the central system and a service requester. In this work, we analyze current solutions for the generation of VRS with regard to data delivery to the service requester and present a solution based on Web Services as an alternative to the model being developed by the Spatial Geodesy Study Group (GEGE/FCT/UNESP). Comparing the solutions, we verified the potential of Web Services to support research on geodetic positioning using VRS. This technology provides interoperability, giving greater flexibility in developing client applications, whether by university researchers or by any person or enterprise wishing to use the service.
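
The abstract does not describe the service interface itself, so the following Python sketch is purely illustrative of the Web Services idea: the endpoint URL, parameter names, and RINEX-style payload are assumptions, not the actual GEGE/FCT/UNESP API.

```python
import requests  # third-party HTTP client

# Hypothetical endpoint and parameters; the abstract does not specify the
# actual service interface published by GEGE/FCT/UNESP.
VRS_ENDPOINT = "http://example.org/vrs-service/generate"

def request_vrs(lat, lon, duration_s):
    """Ask the service to generate a virtual reference station near the
    rover position and return its observation data (e.g., RINEX text)."""
    params = {"lat": lat, "lon": lon, "duration": duration_s}
    response = requests.get(VRS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()  # surface HTTP errors to the caller
    return response.text

if __name__ == "__main__":
    rinex = request_vrs(lat=-22.12, lon=-51.41, duration_s=3600)
    print(rinex[:200])
```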

Relevance:

60.00%

Publisher:

Abstract:

Thermally induced flows exist both in nature and in industry. Of interest for this research are the convection currents in the Earth's mantle and in glass melting tanks. The material transport taking place there results from differences in density, temperature, and chemical concentration within the convecting material. To improve the understanding of these processes, numerous research groups carry out numerical modeling. The algorithms used for this are usually verified by analyzing laboratory experiments. The focus of this research is the development of a method for determining the three-dimensional temperature distribution for the study of thermally induced flows in an experimental tank. However, direct temperature measurement inside the experimental material or the glass melt affects the flow behavior. Therefore, impedance tomography, which works without disturbing the flow, is used. The basis of this method is the extended Arrhenius relationship between temperature and specific electrical conductivity. During the laboratory experiments, a viscous polyethylene glycol-water mixture in a tank is heated from below. Taking scaling into account, the flows generated in this way are an analogue of both the Earth's mantle and the melting tanks. The geoelectric measurements are made via several electrodes installed on the tank walls. After the subsequent three-dimensional inversion of the electrical resistances, a model of the distribution of specific electrical conductivity inside the experimental tank is obtained. This is converted into a temperature distribution using the extended Arrhenius formula. To demonstrate the suitability of this method for the non-invasive determination of the three-dimensional temperature distribution, additional direct temperature measurements were made with several thermocouples on the tank walls and the values were compared. On the whole, the interior temperatures can be reconstructed well, with the achieved accuracy depending on the spatial and temporal resolution of the direct-current geoelectric measurements.
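
As a rough illustration of the conductivity-to-temperature step, here is a minimal Python sketch that inverts a standard Arrhenius law; the thesis uses an extended form whose coefficients the abstract does not give, so all parameter values below are hypothetical.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

# Hypothetical material parameters; the thesis' "extended" Arrhenius
# relation and its coefficients are not given in the abstract.
SIGMA_0 = 230.0  # pre-exponential conductivity, S/m
E_A = 0.20       # activation energy, eV

def temperature_from_conductivity(sigma_s_per_m):
    """Invert sigma = SIGMA_0 * exp(-E_A / (K_B * T)) for T in kelvin."""
    return E_A / (K_B * math.log(SIGMA_0 / sigma_s_per_m))

# Map a few inverted conductivities (as produced by the 3-D geoelectric
# inversion) to temperatures; values chosen to land near lab conditions.
for sigma in (0.08, 0.10, 0.14):
    print(f"{sigma:.2f} S/m -> {temperature_from_conductivity(sigma):.1f} K")
```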

Relevance:

60.00%

Publisher:

Abstract:

Environmental computer models are deterministic models designed to predict environmental phenomena such as air pollution or meteorological events. Numerical model output is given as averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, of unknown calibration, and unaccompanied by any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and predictions for the next three hours. We propose a Bayesian downscaler model based on first differences with a flexible coefficient structure and an efficient computational strategy to fit model parameters. Model validation for the eastern United States shows substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain spatially varying uncertainty associated with numerical model output. We show how we can learn about such uncertainty through suitable stochastic data fusion modeling using external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
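
The 8-hour average being forecast is a fixed window mixing observed and predicted hours; a minimal sketch of that bookkeeping (variable names and values are illustrative, not the paper's code):

```python
def current_8h_average(observed, forecast):
    """8-hour average at the current hour: the previous four observed
    hours, the current observed hour, and the next three forecast hours.

    observed: hourly ozone values ending at the current hour
    forecast: forecast values for the next hours (>= 3 entries)
    """
    window = observed[-5:] + forecast[:3]  # 4 past + current + 3 ahead
    if len(window) != 8:
        raise ValueError("need 5 observed and 3 forecast hours")
    return sum(window) / 8.0

# Example with illustrative ppb values
obs = [52, 55, 58, 61, 63]   # four previous hours + current hour
fc = [64, 62, 59]            # Eta-CMAQ-style forecasts, next three hours
print(current_8h_average(obs, fc))  # -> 59.25
```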

Relevance:

60.00%

Publisher:

Abstract:

As environmental problems become more complex, policy and regulatory decisions become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA's Council for Regulatory Environmental Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What impediments has the CREM faced, and why did these impediments occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining the history of modeling at the EPA, as well as the history of the CREM, provides insight into the many challenges faced when implementing science policy and science policy programs. After examining the many impediments the CREM has faced in implementing modeling policies, it became clear that the impediments fall into two separate categories: classic and paradoxical. Classic impediments are the standard impediments to science policy implementation that might be found in any regulatory environment, such as lack of resources and changes in administration. Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. These impediments, when not properly addressed, severely hinder an organization's ability to successfully implement science policy.

Relevance:

60.00%

Publisher:

Abstract:

A diesel oxidation catalyst (DOC) with a catalyzed diesel particulate filter (CPF) is an effective exhaust aftertreatment device that reduces particulate emissions from diesel engines, and properly designed DOC-CPF systems provide passive regeneration of the filter by oxidation of PM via thermal and NO2/temperature-assisted means under various vehicle duty cycles. However, controlling the backpressure imposed on the engine by adding the CPF to the exhaust system requires a good understanding of the filtration and oxidation processes taking place inside the filter, since the deposition and oxidation of solid particulate matter (PM) change as functions of loading time. In order to understand the solid PM loading characteristics in the CPF, an experimental and modeling study was conducted using emissions data measured from the exhaust of a John Deere 6.8 liter, turbocharged and after-cooled engine with a low-pressure-loop EGR system and a DOC-CPF system (or CCRT® - Catalyzed Continuously Regenerating Trap® - as named by Johnson Matthey) in the exhaust system. A series of experiments was conducted to evaluate the performance of the DOC-only, CPF-only and DOC-CPF configurations at two engine speeds (2200 and 1650 rpm) and various loads ranging from 5 to 100% of maximum torque at both speeds. Pressure drop across the DOC and CPF, mass deposited in the CPF at the end of loading, upstream and downstream gaseous and particulate emissions, and particle size distributions were measured at different times during the experiments to characterize the pressure drop and filtration efficiency of the DOC-CPF system as functions of loading time. Pressure drop characteristics measured experimentally across the DOC-CPF system showed a distinct deep-bed filtration region characterized by a non-linear rise in pressure drop, followed by a transition region, and then by a cake-filtration region with steadily increasing pressure drop with loading time for engine load cases with CPF inlet temperatures below 325 °C. For engine load cases with CPF inlet temperatures above 360 °C, the deep-bed filtration region showed a steep rise in pressure drop followed by a decrease in pressure drop (due to wall PM oxidation) in the cake-filtration region. Filtration efficiencies observed during PM cake filtration were greater than 90% in all engine load cases. Two computer models, the MTU 1-D DOC model and the MTU 1-D 2-layer CPF model, were developed and/or improved from existing models as part of this research and calibrated using the data obtained from these experiments. The 1-D DOC model employs a three-way catalytic reaction scheme for CO, HC and NO oxidation, and is used to predict CO, HC, NO and NO2 concentrations downstream of the DOC. Calibration results from the 1-D DOC model against experimental data at 2200 and 1650 rpm are presented. The 1-D 2-layer CPF model uses a '2-filters in series' approach for filtration, PM deposition and oxidation in the PM cake and substrate wall via thermal (O2) and NO2/temperature-assisted mechanisms, and production of NO2 as the exhaust gas mixture passes through the CPF catalyst washcoat. Calibration results from the 1-D 2-layer CPF model against experimental data at 2200 rpm are presented. Comparisons of the filtration and oxidation behavior of the CPF at sample load cases in both configurations are also presented.
The input parameters and selected results are also compared with those of a similar research study using an earlier version of the CCRT®, to explain differences in the fundamental behavior of the CCRT® in the two studies. An analysis of the results from the calibrated CPF model suggests that pressure drop across the CPF depends mainly on PM loading and oxidation in the substrate wall, and that the substrate wall initiates PM filtration and helps form a PM cake layer on the wall. After formation of a PM cake layer of about 1-2 µm on the wall, the PM cake becomes the primary filter and performs 98-99% of PM filtration. In all load cases, most of the PM mass deposited was in the PM cake layer, and PM oxidation in the PM cake layer accounted for 95-99% of the total PM mass oxidized during loading. Overall PM oxidation efficiency of the DOC-CPF device increased with increasing CPF inlet temperatures and NO2 flow rates, and was higher in the CCRT® configuration than in the CPF-only configuration due to higher CPF inlet NO2 concentrations. Filtration efficiencies greater than 90% were observed within 90-100 minutes of loading time (starting with a clean filter) in all load cases, because the PM cake on the substrate wall forms a very efficient filter. A good strategy for maintaining high filtration efficiency and low pressure drop while performing active regeneration would be to clean the PM cake filter only partially (i.e., retaining a cake layer 1-2 µm thick on the substrate wall) and to completely oxidize the PM deposited in the substrate wall. The data presented support this strategy.
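
The pressure-drop regimes described above are often rationalized with a series Darcy resistance for the PM cake and the substrate wall. The sketch below is that textbook relation with illustrative numbers; it is not the MTU 1-D 2-layer CPF model itself.

```python
def cpf_pressure_drop(mu, u_wall, w_cake, k_cake, w_wall, k_wall):
    """Series Darcy pressure drop (Pa) across the PM cake and substrate wall.

    mu: exhaust dynamic viscosity (Pa*s), u_wall: wall-through velocity (m/s),
    w_*: layer thicknesses (m), k_*: permeabilities (m^2).
    """
    return mu * u_wall * (w_cake / k_cake + w_wall / k_wall)

# Illustrative values only, not calibrated to the engine data in the thesis
dp = cpf_pressure_drop(mu=3.0e-5, u_wall=0.03,
                       w_cake=1.5e-6, k_cake=5.0e-14,   # ~1-2 um cake layer
                       w_wall=4.0e-4, k_wall=1.0e-12)
print(f"{dp:.0f} Pa")  # -> ~390 Pa for these numbers
```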

Relevance:

60.00%

Publisher:

Abstract:

Biofuels are an increasingly important component of worldwide energy supply. This research aims to understand the pathways and impacts of biofuels production, and to improve these processes to make them more efficient. In Chapter 2, a life cycle assessment (LCA) is presented for cellulosic ethanol production from five potential feedstocks of regional importance to the upper Midwest - hybrid poplar, hybrid willow, switchgrass, diverse prairie grasses, and logging residues - according to the requirements of the Renewable Fuel Standard (RFS). Direct land use change emissions are included for the conversion of abandoned agricultural land to feedstock production, and computer models of the conversion process are used to determine the effect of varying biomass composition on overall life cycle impacts. All scenarios analyzed here result in greater than 60% reduction in greenhouse gas emissions relative to petroleum gasoline. Land use change effects were found to contribute significantly to the overall emissions for the first 20 years after plantation establishment. Chapter 3 is an investigation of the effects of biomass mixtures on overall sugar recovery from the combined processes of dilute acid pretreatment and enzymatic hydrolysis. The biomass species studied were aspen, a hardwood well suited to biochemical processing; balsam, a high-lignin softwood; and switchgrass, an herbaceous energy crop with high ash content. A matrix of three dilute acid pretreatment severities and three enzyme loading levels was used to characterize interactions between pretreatment and enzymatic hydrolysis. The maximum glucose yield for any species was 70% of theoretical, for switchgrass, and the maximum xylose yield was 99.7% of theoretical, for aspen. Supplemental β-glucosidase increased glucose yield from enzymatic hydrolysis by an average of 15%, and total sugar recoveries for mixtures could be predicted to within 4% by linear interpolation of the pure-species results. Chapter 4 is an evaluation of the potential for producing Trichoderma reesei cellulose hydrolases in the Kluyveromyces lactis yeast expression system. The exoglucanases Cel6A and Cel7A and the endoglucanase Cel7B were inserted separately into K. lactis, and the enzymes were analyzed for activity on various substrates. Recombinant Cel7B was found to be active on carboxymethyl cellulose and Avicel powdered cellulose substrates. Recombinant Cel6A was also found to be active on Avicel. Recombinant Cel7A was produced, but no enzymatic activity was detected on any substrate. Chapter 5 presents a new method for enzyme improvement studies using enzyme co-expression and yeast growth rate measurements as a potential high-throughput expression and screening system in K. lactis. Two K. lactis strains were evaluated for their usefulness in growth screening studies: a wild-type strain and a strain in which the main galactose metabolic pathway has been disabled. Sequential transformation and co-expression of the exoglucanase Cel6A and endoglucanase Cel7B was performed, and improved hydrolysis rates on Avicel were detectable in the cell culture supernatant. Future work should focus on hydrolysis of natural substrates, developing the growth screening method, and utilizing the K. lactis expression system for directed evolution of enzymes.
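
The mixture result from Chapter 3 (total sugar recoveries predictable to within 4% by linear interpolation of pure-species data) amounts to a mass-weighted average, sketched below with made-up recovery values:

```python
def mixture_recovery(fractions, pure_recoveries):
    """Predict total sugar recovery of a biomass blend as the mass-weighted
    (linear) combination of pure-species recoveries."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("mass fractions must sum to 1")
    return sum(f * r for f, r in zip(fractions, pure_recoveries))

# Illustrative recoveries (fraction of theoretical), not the thesis data:
# aspen, balsam, switchgrass
pure = [0.80, 0.35, 0.65]
blend = [0.50, 0.25, 0.25]  # 50/25/25 mixture by mass
print(mixture_recovery(blend, pure))  # -> 0.65
```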

Relevance:

60.00%

Publisher:

Abstract:

Bending shear was observed to produce nearly vertical shear bands in a calving ice wall standing on dry land on Deception Island (lat. 63.0° S., long. 60.6° W.), and slabs calved straight downward when shear rupture occurred along these shear bands (Hughes, 1989). A formula for the calving rate was developed from the Deception Island data, and we have attempted to justify generalizing this formula to include ice walls standing along beaches or in water. These are environments in which a wave-washed groove develops along the base of the ice wall or along a water line above the base. The rate of wave erosion provides an alternative mechanism for controlling the calving rate in these environments. We have determined that the rate at which bending creep produces nearly vertical shear bands, along which shear rupture occurs, controls the calving rate in all environments. Shear rupture occurs at a calving shear stress of about 1 bar. Our results justify using the calving formula to compute the calving rate of ice walls in computer models of ice-sheet dynamics. This is especially important in simulating the retreat of Northern Hemisphere ice sheets during the last deglaciation, when marine and lacustrine environments were common along retreating ice margins. These margins would have been ice walls standing along beaches or in water, because floating ice shelves are not expected in the ablation zone of retreating ice sheets.

Relevance:

60.00%

Publisher:

Abstract:

Changes of porosity, permeability, and tortuosity due to physical and geochemical processes are of vital importance for a variety of hydrogeological systems, including passive treatment facilities for contaminated groundwater, engineered barrier systems (EBS), and host rocks for high-level nuclear waste (HLW) repositories. Due to the nonlinear nature and chemical complexity of the problem, in most cases, it is impossible to verify reactive transport codes analytically, and code intercomparisons are the most suitable method to assess code capabilities and model performance. This paper summarizes model intercomparisons for six hypothetical scenarios with generally increasing geochemical or physical complexity using the reactive transport codes CrunchFlow, HP1, MIN3P, PFlotran, and TOUGHREACT. Benchmark problems include the enhancement of porosity and permeability through mineral dissolution, as well as near complete clogging due to localized mineral precipitation, leading to reduction of permeability and tortuosity. Processes considered in the benchmark simulations are advective-dispersive transport in saturated media, kinetically controlled mineral dissolution-precipitation, and aqueous complexation. Porosity changes are induced by mineral dissolution-precipitation reactions, and the Carman-Kozeny relationship is used to describe changes in permeability as a function of porosity. Archie’s law is used to update the tortuosity and the pore diffusion coefficient as a function of porosity. Results demonstrate that, generally, good agreement is reached amongst the computer models despite significant differences in model formulations. Some differences are observed, in particular for the more complex scenarios involving clogging; however, these differences do not affect the interpretation of system behavior and evolution.
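
For reference, the two constitutive updates named above are commonly written as k = k0 (phi/phi0)^3 ((1-phi0)/(1-phi))^2 (Carman-Kozeny) and D_p = D0 phi^m (Archie). The benchmark's exact parameterization is given in the paper, so the exponent m and all numbers below are illustrative.

```python
def carman_kozeny(k0, phi0, phi):
    """Permeability update from porosity via the normalized Carman-Kozeny
    relation: k = k0 * (phi/phi0)**3 * ((1-phi0)/(1-phi))**2."""
    return k0 * (phi / phi0) ** 3 * ((1 - phi0) / (1 - phi)) ** 2

def archie_pore_diffusion(d0, phi, m=2.0):
    """Pore diffusion coefficient update via Archie's law, D = D0 * phi**m.
    The cementation exponent m is case-specific; m = 2 is a common choice."""
    return d0 * phi ** m

# Example: mineral precipitation lowers porosity from 0.35 to 0.20
k = carman_kozeny(k0=1.0e-12, phi0=0.35, phi=0.20)
d = archie_pore_diffusion(d0=1.0e-9, phi=0.20)
print(f"k = {k:.2e} m^2, D_p = {d:.2e} m^2/s")
```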

Relevance:

60.00%

Publisher:

Abstract:

The radiation dose rates at flight altitudes can increase by orders of magnitude for a short time during energetic solar cosmic ray events, so-called ground level enhancements (GLEs). Especially at high latitudes and flight altitudes, solar energetic particles superposed on galactic cosmic rays may cause radiation doses that exceed the maximum allowed limit for the general public. Therefore, the determination of the radiation dose rate during GLEs should be as reliable as possible. Radiation dose rates along flight paths are typically determined by computer models based on cosmic ray flux and anisotropy parameters derived from neutron monitor and/or satellite measurements. The characteristics of the GLE on 15 April 2001 (GLE60) were determined and published by various authors. In this work we compare these results and investigate the consequences for the computed radiation dose rates along selected flight paths. In addition, we compare the computed radiation dose rates with measurements made during GLE60 on board two transatlantic flights.

Relevance:

60.00%

Publisher:

Abstract:

Mass estimates for Late Miocene and Pliocene (8.6-3.25 Ma) Discoaster species and Sphenolithus are determined using samples from the equatorial Atlantic (Ceara Rise: ODP Site 927). Based on morphometric measurements, 3D computer models were created for 11 Discoaster species and their volumes calculated. From these, shape factors (ks) were derived to allow calculation of mass for different-sized discoasters and for Sphenolithus abies. The mass estimates were then used to calculate the contribution of these nannofossils to the total nannofossil carbonate. The discoaster contribution ranges from 10% to 40%, with a decreasing trend through the investigated interval. However, our estimates of total nannofossil carbonate from size-corrected abundance data are consistently 30-50% lower than estimates from grain-size measurement; this suggests that data based on mass estimates need to be interpreted with caution.
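
The shape-factor approach reduces each morphology to a single dimensionless constant, so mass scales with the cube of the measured size; a minimal sketch with illustrative values (not the Site 927 measurements):

```python
CALCITE_DENSITY = 2.7  # pg per cubic micrometre (i.e., 2.7 g/cm^3)

def nannofossil_mass_pg(ks, length_um):
    """Mass (picograms) from the shape-factor approach: m = ks * L**3 * rho.
    ks folds the particle's 3-D geometry into one dimensionless number."""
    return ks * length_um ** 3 * CALCITE_DENSITY

# Illustrative shape factor and size, not values derived in the study
print(nannofossil_mass_pg(ks=0.06, length_um=12.0))  # -> ~280 pg
```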

Relevance:

60.00%

Publisher:

Abstract:

Mass transport and mass flux values for the different types of glaciers in the Sør-Rondane are calculated from computer models, based upon gravity data and geodetic stake velocity measurements. The results are interpreted in the light of a general flow line analysis, glacial geological investigations and of the ablation terms of the mass balance for Dronning Maud Land and Antarctica.

Relevance:

60.00%

Publisher:

Abstract:

Currently there are applications that simulate the behavior of bacteria in different habitats and the processes occurring within them, making it possible to study and experiment without an actual laboratory. One of the most widely used open-source applications for simulating bacterial populations is iDynoMiCS (individual-based Dynamics of Microbial Communities Simulator), an agent-based simulator that supports several computational models of bacteria in 2D and 3D biofilms. The simulator offers great freedom through a large number of configurable variables concerning the environment, chemical reactions, and other important details of the simulation, and it includes a basic framework for simulating plasmid conjugation between bacteria.
Plasmids are DNA molecules distinct from the cell's chromosome, usually small and circular, that are replicated, transcribed, and conjugated independently of chromosomal DNA. They are normally present in prokaryotes and occasionally in eukaryotes, where they are called episomes. Plasmids are mechanisms external to the cell's basic operations and in most cases confer various evolutionary advantages on the host, such as antibiotic resistance, which makes their study and subsequent manipulation important. However, the operational framework of iDynoMiCS for plasmid simulation is too simple and does not allow operations more complex than analyzing the spread of a plasmid through the community. This work was conceived to resolve that deficiency: it analyzes, develops, and implements the changes needed for iDynoMiCS to simulate plasmid conjugation satisfactorily and more realistically, and to support solving various logic operations, such as plasmid-based genetic circuits. The results obtained are analyzed against relevant studies and compared with those produced by the original iDynoMiCS code, and an additional study compares the efficiency of detecting a substance with two different genetic circuits. This work may also be of interest to the LIA group of the Faculty of Informatics of the Universidad Politécnica de Madrid, which is participating in the European project BACTOCOM, focused on the study of plasmid conjugation and genetic circuits.
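
iDynoMiCS itself is written in Java and driven by XML protocol files; purely to illustrate the kind of contact-based transfer rule being extended, here is a language-neutral Python sketch in which the probability, range, and plasmid name are all made up:

```python
import math
import random

CONJUGATION_PROB = 0.05   # per contact per timestep; illustrative value
PILUS_RANGE = 3.0         # max center-to-center distance (um); illustrative

def conjugation_step(donors, recipients):
    """One timestep of contact-based plasmid transfer: each donor may pass
    a plasmid copy to at most one neighboring plasmid-free cell."""
    for donor in donors:
        for cell in recipients:
            dist = math.dist(donor["pos"], cell["pos"])
            if (dist <= PILUS_RANGE and not cell["plasmids"]
                    and random.random() < CONJUGATION_PROB):
                cell["plasmids"].add("pTest")  # recipient becomes transconjugant
                break  # at most one transfer per donor per step

donors = [{"pos": (0.0, 0.0), "plasmids": {"pTest"}}]
recipients = [{"pos": (1.5, 2.0), "plasmids": set()}]
conjugation_step(donors, recipients)
print(recipients[0]["plasmids"])
```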

Relevance:

60.00%

Publisher:

Abstract:

Kinépolis Madrid is one of the largest cinema complexes in the world, even holding Guinness records such as the cinema complex with the most seats in the world. It consists of 25 theaters with capacities between 220 and 996 spectators. All of them are equipped with the latest sound and image technologies and are conditioned so that their acoustic characteristics are optimal; nevertheless, the complex has no documentation of these characteristics. This final degree project (PFG) measures several acoustic parameters, such as clarity, definition, and intelligibility, paying special attention to the reverberation time, since it is one of the most significant parameters for characterizing a room acoustically. The study focuses on theater number 3, with capacity for 327 spectators, one of the medium-sized rooms of the complex. In addition to the acoustic measurements, the dimensions of the room are measured in order to build two virtual models of it: one detailed, the other simpler. Simulations are then run on these models to obtain the same parameters measured in the real room. Once the acoustic parameters have been obtained in both ways, the measured and simulated values are compared, checking whether their differences exceed thresholds that indicate whether the computer models can genuinely represent the real room. Finally, conclusions are drawn about which of the two models comes closer to the real measurements, how to perform the simulations, which signal types to use in the measurements, and which parameters to take into account, so as to save time in future work.
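
A quick way to anchor the reverberation-time discussion is Sabine's classical estimate, RT60 = 0.161 * V / A. The project's comparisons rely on measured and simulated impulse responses, so the sketch below, with invented surface data, is only a back-of-the-envelope check.

```python
def sabine_rt60(volume_m3, absorption_areas):
    """Sabine estimate of reverberation time: RT60 = 0.161 * V / A, where A
    is the total equivalent absorption area (sum of surface * coefficient)."""
    a_total = sum(s * alpha for s, alpha in absorption_areas)
    return 0.161 * volume_m3 / a_total

# Illustrative geometry for a mid-size cinema, not the measured room 3 data:
# (surface area in m^2, absorption coefficient at 1 kHz)
surfaces = [(400, 0.70),   # upholstered seating area
            (350, 0.10),   # walls
            (250, 0.05)]   # ceiling
print(f"RT60 ~ {sabine_rt60(2000, surfaces):.2f} s")  # -> ~0.98 s
```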

Relevance:

60.00%

Publisher:

Abstract:

This research presents an adaptation of the mathematical model of fuzzy logic. The adaptation is an alternative capable of representing the behavior of a subjective variable over a time interval, while still handling static variables (as the existing computational model does). Prior research points to a gap in the treatment of dynamic (time-dependent) variables, and the proposal allows the context in which the variables are embedded to play a role in understanding and deciding problems with these characteristics. Existing computational models treat time as an event sequencer or as a cost, without considering the influence of past phenomena on the current condition; the proposed model, by contrast, lets previous events contribute to the understanding and handling of the current state. To cite just a few examples, the proposed solution can be applied to determining comfort levels in public transport or to assessing the degree of risk of stock-market investments. In both cases, comparisons between the existing fuzzy logic model and the suggested adaptation show a difference in the final result that can be understood as higher-quality information for decision support.
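
The abstract does not give the adapted formalism, so the sketch below is only one plausible reading: making a fuzzy membership degree history-dependent by exponentially smoothing the instantaneous memberships. Everything here, including the decay parameter, is an assumption for illustration.

```python
def dynamic_membership(instant_memberships, decay=0.3):
    """History-aware membership: exponentially smooth the instantaneous
    fuzzy membership degrees so past observations influence the current one.

    decay in (0, 1]: weight of the newest observation (1.0 = static fuzzy).
    """
    mu = instant_memberships[0]
    for m in instant_memberships[1:]:
        mu = decay * m + (1 - decay) * mu
    return mu

# Hourly 'crowded' memberships for a bus line: a static model would report
# only the last value (0.4), while the smoothed value retains the rush peak.
series = [0.2, 0.9, 0.95, 0.7, 0.4]
print(round(dynamic_membership(series), 3))  # -> 0.547
```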