982 results for Modeling problems
Abstract:
The interaction between proteins and inorganic surfaces is fascinating from both an applied and a theoretical point of view. It is an important aspect of many applications, including surgical implants and biosensors. It is also an example of theoretical questions concerning the interface between hard and soft matter. What is certain is that knowledge of the mechanisms involved is required in order to understand, predict, and optimize the interaction between proteins and surfaces. Recent advances in experimental research make it possible to investigate direct peptide-metal binding, which has moved the exploration of the theoretical foundations further into the focus of current research. One way to investigate the interaction between proteins and inorganic surfaces is through computer simulations. Although simulations of metal surfaces or of proteins as individual systems have long been established, simulating a combination of the two systems brings new difficulties. Overcoming them requires a multiscale approach: while proteins, as biological systems, can be described adequately with classical molecular dynamics, describing the delocalized electrons of metallic systems requires a quantum mechanical formulation. The most important prerequisite of a multiscale approach is consistency between the simulations on the different scales. In this work, this is achieved by linking simulations of alternating scales. The thesis begins with an investigation of the thermodynamics of benzene hydration using classical molecular dynamics. Then the interaction between water and the [111] metal surfaces of gold and nickel is modeled using a multiscale approach. In a further step, the adsorption of benzene on metal surfaces in an aqueous environment is studied. Finally, the modeling is extended to include the amino acids alanine and phenylalanine. This opens the possibility of treating realistic protein-metal systems in computer simulations and of predicting, on a theoretical basis, the interaction between peptides and surfaces for any kind of peptide and surface.
Abstract:
Population growth in urban areas is a worldwide phenomenon. According to a recent United Nations report, over half of the world's population now lives in cities. Numerous health and environmental issues arise from this unprecedented urbanization. Recent studies have demonstrated the effectiveness of urban green spaces and the role they play in improving both the aesthetics of cities and the quality of life of their residents. In particular, urban green spaces provide ecosystem services such as improving urban air quality by removing pollutants that can cause serious health problems, carbon storage, carbon sequestration, and climate regulation through shading and evapotranspiration. Furthermore, epidemiological studies controlling for age, sex, marital status, and socio-economic status have provided evidence of a positive relationship between green space and the life expectancy of senior citizens. However, there is little information on the role of public green spaces in mid-sized cities in northern Italy. To address this need, a study was conducted to assess the ecosystem services of urban green spaces in the city of Bolzano, South Tyrol, Italy. In particular, we quantified the cooling effect of urban trees and the hourly amount of pollution removed by the urban forest. The information was gathered using field data collected through local hourly air pollution readings, a tree inventory, and simulation models. During the study we quantified pollution removal for ozone, nitrogen dioxide, carbon monoxide, and particulate matter (<10 microns). We estimated the above-ground carbon stored and annually sequestered by the urban forest. Results were compared to transportation CO2 emissions to determine the CO2 offset potential of urban streetscapes. Furthermore, we assessed commonly used methods for estimating carbon stored and sequestered by urban trees in the city of Bolzano. We also quantified ecosystem disservices such as hourly urban forest volatile organic compound emissions.
Abstract:
The present work is motivated by biological questions concerning the behavior of membrane potentials in neurons. A widely considered model for spiking neurons is the following. Between spikes, the membrane potential behaves like a diffusion process X given by the SDE $dX_t = \beta(X_t)\,dt + \sigma(X_t)\,dB_t$, where $(B_t)$ denotes a standard Brownian motion. Spikes are explained as follows: as soon as the potential X crosses a certain excitation threshold S, a spike occurs, after which the potential is reset to a certain value x_0. In applications it is sometimes possible to observe the diffusion process X between spikes and to estimate the coefficients $\beta(\cdot)$ and $\sigma(\cdot)$ of the SDE. Nevertheless, the thresholds x_0 and S must still be determined to specify the model. One way to approach this problem is to regard x_0 and S as parameters of a statistical model and to estimate them. In the present work, four different cases are discussed, in which we assume that the membrane potential X between spikes is a Brownian motion with drift, a geometric Brownian motion, an Ornstein-Uhlenbeck process, or a Cox-Ingersoll-Ross process. Moreover, we observe the times between consecutive spikes, which we regard as iid hitting times of the threshold S by X started in x_0. The first two cases are very similar, and in each the maximum likelihood estimator can be given explicitly. Furthermore, using LAN theory, the optimality of these estimators is shown. In the Ornstein-Uhlenbeck and Cox-Ingersoll-Ross cases we choose a minimum distance method based on comparing the empirical and the true Laplace transform with respect to a Hilbert space norm. We prove that all estimators are strongly consistent and asymptotically normal. In the last chapter, we examine the efficiency of the minimum distance estimators on simulated data. Furthermore, applications to real data sets and their results are discussed in detail.
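The Brownian-motion-with-drift case above admits explicit estimation because the hitting times are inverse Gaussian. Below is a minimal Python sketch of that case, assuming the drift beta and volatility sigma are already known from the intra-spike observations; the variable names and numbers are illustrative, and in this simplified setting only the threshold gap S - x_0 is identifiable, since the hitting-time law depends on x_0 and S only through their difference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model: between spikes, X_t = x0 + beta*t + sigma*B_t; a spike fires when
# X first hits the threshold S > x0.  For beta > 0 the hitting time is
# inverse Gaussian with mean m = (S - x0)/beta and shape
# lam = (S - x0)**2 / sigma**2, so only the gap d = S - x0 enters.
beta, sigma = 1.5, 0.8      # assumed known (estimated from intra-spike data)
d_true = 2.0                # true threshold gap S - x0
n = 10_000

m = d_true / beta
lam = d_true**2 / sigma**2
times = rng.wald(m, lam, size=n)   # iid inter-spike intervals

# Explicit (unconstrained) inverse-Gaussian MLEs:
# m_hat = sample mean, 1/lam_hat = mean(1/t - 1/m_hat).
m_hat = times.mean()
lam_hat = 1.0 / np.mean(1.0 / times - 1.0 / m_hat)

# Back out the gap d = S - x0 two ways, from the mean and from the shape.
d_from_mean = beta * m_hat
d_from_shape = sigma * np.sqrt(lam_hat)
print(f"d = S - x0: true {d_true}, from mean {d_from_mean:.3f}, "
      f"from shape {d_from_shape:.3f}")
```

Comparing the two estimates of the gap is a quick consistency check on the model; the thesis's actual estimator treats the problem more carefully than this unconstrained fit.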
Abstract:
This thesis explores system performance for reconfigurable distributed systems and provides an analytical model for determining the throughput of theoretical systems based on the OpenSPARC FPGA Board and the SIRC Communication Framework. This model was developed by studying a small set of variables that together determine a system's throughput. The importance of this model lies in assisting system designers in deciding whether or not to commit to designing a reconfigurable distributed system, based on the estimated performance and hardware costs. Because custom hardware design and distributed system design are both time consuming and costly, it is important for designers to make decisions regarding system feasibility early in the development cycle. Based on experimental data, the model presented in this paper shows a close fit, with less than 10% error on average. The model is limited to a certain range of problems, but it can still be used within those limitations, and it provides a foundation for further development of models of reconfigurable distributed systems.
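The abstract does not reproduce the model itself, so the Python sketch below only illustrates the general shape of such a closed-form throughput estimate: a host feeding jobs to several FPGA boards over a shared link, bound by whichever stage saturates first. The function, its parameters, and the numbers are assumptions of mine, not the thesis's fitted model.

```python
def estimated_throughput(n_boards, job_bytes, link_bytes_per_s,
                         board_jobs_per_s, fixed_overhead_s=0.0):
    """Rough jobs/second estimate for a host feeding n_boards FPGA boards.

    Assumes the host link is shared (communication serializes) while the
    boards compute in parallel, so throughput is capped by whichever stage
    saturates first.  Illustrative only -- not the thesis's model.
    """
    t_comm = job_bytes / link_bytes_per_s + fixed_overhead_s  # per job, serial
    comm_bound = 1.0 / t_comm                    # jobs/s the link can feed
    compute_bound = n_boards * board_jobs_per_s  # jobs/s the boards can finish
    return min(comm_bound, compute_bound)

# Example: 4 boards, 1 MiB jobs over a ~1 Gb/s link, 50 jobs/s per board.
print(estimated_throughput(4, 2**20, 125e6, 50.0, fixed_overhead_s=1e-3))
```

A model of this shape makes the feasibility trade-off explicit early on: if the communication bound dominates, adding boards cannot raise throughput.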
Abstract:
The long-term performance of infrastructure depends on reliable and sustainable designs. Many of Pennsylvania's streams experience sediment transport problems that increase maintenance costs and lower the structural integrity of bridge crossings. A stream restoration project is one common mitigation measure used to correct such problems at bridge crossings. Specifically, in an attempt to alleviate aggradation problems at the Old Route 15 Bridge crossing on White Deer Creek, in White Deer, PA, two in-stream structures (rock cross vanes) and several bank stabilization features were installed along with a complete channel redevelopment. The objectives of this research were to characterize the hydraulic and sediment transport processes occurring at the White Deer Creek site and to investigate, through physical and mathematical modeling, the use of in-stream restoration structures. The goal is to use the results of this study to prevent aggradation and other sediment-related problems in the vicinity of bridges through improved design considerations. Monitoring and modeling indicate that the study site on White Deer Creek is currently unstable, experiencing general channel down-cutting, bank erosion, and several local areas of increased aggradation and degradation of the channel bed. An in-stream structure installed upstream of the Old Route 15 Bridge failed by sediment burial, caused by the high sediment load that White Deer Creek transports as well as by the backwater effects of the bridge crossing. The in-stream structure installed downstream of the Old Route 15 Bridge is beginning to fail because of the alignment of the structure relative to the approach direction of flow from upstream.
Abstract:
Generalized linear mixed models (GLMMs) provide an elegant framework for the analysis of correlated data. Because the likelihood is not available in closed form, GLMMs are often fit by computational procedures like penalized quasi-likelihood (PQL). Special cases of these models are generalized linear models (GLMs), which are often fit using algorithms like iterative weighted least squares (IWLS). High computational costs and memory constraints often make it difficult to apply these iterative procedures to data sets with a very large number of cases. This paper proposes a computationally efficient strategy based on the Gauss-Seidel algorithm that iteratively fits sub-models of the GLMM to subsetted versions of the data. Additional gains in efficiency are achieved for Poisson models, commonly used in disease mapping problems, because their special collapsibility property allows data reduction through summaries. Convergence of the proposed iterative procedure is guaranteed for canonical link functions. The strategy is applied to investigate the relationship between ischemic heart disease, socioeconomic status, and age/gender category in New South Wales, Australia, based on outcome data consisting of approximately 33 million records. A simulation study demonstrates the algorithm's reliability in analyzing a data set with 12 million records for a (non-collapsible) logistic regression model.
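For the GLM special case mentioned above, IWLS is short enough to sketch. Below is a minimal Python implementation for Poisson regression with the canonical log link (for which convergence of the paper's procedure is guaranteed); it is a plain single-machine IWLS, not the paper's Gauss-Seidel sub-model scheme, and the toy data are mine.

```python
import numpy as np

def poisson_iwls(X, y, n_iter=25, tol=1e-8):
    """Fit a Poisson GLM with canonical log link by iterative weighted
    least squares (IWLS); a minimal sketch, not the paper's algorithm."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        W = mu                          # Var(y) = mu for Poisson
        z = eta + (y - mu) / mu         # working response
        # Weighted least squares step: solve (X'WX) beta = X'Wz.
        XtW = X.T * W
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Toy data: intercept plus two covariates.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(5000), rng.normal(size=(5000, 2))])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.3, -0.2])))
print(poisson_iwls(X, y))   # should be close to [0.5, 0.3, -0.2]
```

The collapsibility the paper exploits corresponds to the fact that, with a log link and categorical covariates, the sufficient statistics are cell totals, so the same fit can be run on aggregated counts instead of individual records.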
Abstract:
Adding conductive carbon fillers to insulating thermoplastic resins increases composite electrical and thermal conductivity. Often, as much of a single type of carbon filler as possible is added to achieve the desired conductivity while still allowing the material to be molded into a bipolar plate for a fuel cell. In this study, varying amounts of three different carbons (carbon black, synthetic graphite particles, and carbon fiber) were added to Vectra A950RX Liquid Crystal Polymer. The in-plane thermal conductivities of the resulting single-filler composites were tested. The results showed that adding synthetic graphite particles caused the largest increase in the in-plane thermal conductivity of the composite. The composites were modeled using ellipsoidal inclusion problems to predict the effective in-plane thermal conductivities at varying volume fractions from physical property data of the constituents alone. The synthetic graphite and carbon black composites were modeled using the average field approximation with ellipsoidal inclusions, and the model showed good agreement with the experimental data. The carbon fiber polymer composite was modeled using an assemblage of coated ellipsoids, and this model also showed good agreement with the experimental data.
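For spherical inclusions, the average field (Maxwell-Garnett) estimate reduces to a one-line formula, which makes a convenient sketch; the thesis's ellipsoidal version adds shape-dependent depolarization factors that are omitted here, and the property values below are rough placeholders, not the measured data.

```python
def maxwell_garnett(k_matrix, k_particle, phi):
    """Effective thermal conductivity of a suspension of spherical
    particles (average field / Maxwell-Garnett estimate).  The thesis uses
    the ellipsoidal-inclusion generalization; this spherical special case
    drops the depolarization factors for brevity."""
    num = k_particle + 2 * k_matrix + 2 * phi * (k_particle - k_matrix)
    den = k_particle + 2 * k_matrix - phi * (k_particle - k_matrix)
    return k_matrix * num / den

# Placeholder values: polymer matrix ~0.2 W/m-K, graphite-like filler
# ~300 W/m-K, at a few volume fractions phi.
for phi in (0.1, 0.2, 0.3):
    print(phi, round(maxwell_garnett(0.2, 300.0, phi), 3))
```

Note how the prediction needs only the constituent conductivities and the volume fraction, which is exactly the appeal of effective-medium models described above.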
Abstract:
As environmental problems become more complex, policy and regulatory decisions become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA's Council for Regulatory Environmental Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What impediments has the CREM faced, and why did these impediments occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining the history of modeling at the EPA, as well as the history of the CREM, provides insight into the many challenges that are faced when implementing science policy and science policy programs. After examining the many impediments that the CREM has faced in implementing modeling policies, it was clear that they fall into two separate categories: classic and paradoxical. The classic impediments are the standard impediments to science policy implementation that might be found in any regulatory environment, such as lack of resources and changes in administration. Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. These impediments, when not properly addressed, severely hinder the ability of organizations to successfully implement science policy.
Abstract:
A great increase in private car ownership took place in China from 1980 to 2009 with the development of the economy. To explain the relationship between car ownership and economic and social changes, an ordinary least squares (OLS) linear regression model is developed using car ownership per capita as the dependent variable, with GDP, savings deposits, and highway mileage per capita as the independent variables. The model is tested and corrected for econometric problems such as spurious correlation and cointegration. Finally, the regression model is used to project oil consumption by the Chinese transportation sector through 2015. The results show that in 2015 private cars will consume about 2.0 million barrels of oil per day in the conservative scenario and about 2.6 million barrels per day in the high-case scenario; both are much higher than the 2009 consumption level of 1.9 million barrels per day. The results also show that the annual growth rate of oil demand by transportation from 2010 to 2015 is 2.7%-3.1% per year in the conservative scenario and 6.9%-7.3% per year in the high-case scenario. As a result, actions such as increasing oil efficiency need to be taken to deal with the challenges of the increasing demand for oil.
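The fitted equation itself is not given in the abstract; the Python sketch below shows the shape of such a per-capita OLS specification using statsmodels, with synthetic stand-in series rather than the Chinese data. With trending series like these, one would also test the residuals for cointegration before trusting a levels regression, as the abstract notes.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 30                                    # ~annual observations, 1980-2009
gdp = np.linspace(0.3, 4.0, n) * (1 + 0.05 * rng.normal(size=n))
savings = 0.8 * gdp + 0.1 * rng.normal(size=n)
highway = np.linspace(0.1, 1.5, n)
# Synthetic per-capita car ownership; coefficients are placeholders.
cars = (0.02 + 0.05 * gdp + 0.03 * savings + 0.01 * highway
        + 0.01 * rng.normal(size=n))

X = sm.add_constant(np.column_stack([gdp, savings, highway]))
model = sm.OLS(cars, X).fit()
print(model.params)      # coefficient estimates
print(model.summary())   # t-stats, R^2, Durbin-Watson for autocorrelation
```

A projection then simply plugs scenario paths for the regressors into the fitted equation, which is how conservative and high-case oil demands diverge.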
Abstract:
With the economic development of China, the demand for electricity generation is rapidly increasing. To explain electricity generation, we use GDP, the ratio of urban to rural population, the average per capita income of urban residents, the electricity price for industry in Beijing, and the policy shift that took place in China. Ordinary least squares (OLS) is used to develop a model for the 1979-2009 period. During the design of the model, econometric methods are used to test and refine it. The final model is used to forecast total electricity generation and to assess the possible role of photovoltaic generation. Due to the high demand for resources and serious environmental problems, China is pushing to develop the photovoltaic industry. The system price of PV is falling; therefore, photovoltaics may become competitive in the future.
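A common way to encode a one-time policy shift in an OLS specification is an indicator (dummy) regressor whose coefficient captures the level change. A minimal sketch with synthetic data and a hypothetical break year follows, since neither the actual series nor the break date is given in the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
years = np.arange(1979, 2010)            # the 1979-2009 sample period
policy = (years >= 2002).astype(float)   # hypothetical policy-shift year
gdp = np.linspace(1.0, 12.0, years.size)
# Synthetic generation series with a level shift after the policy change.
gen = 0.5 + 0.8 * gdp + 1.5 * policy + 0.3 * rng.normal(size=years.size)

X = sm.add_constant(np.column_stack([gdp, policy]))
fit = sm.OLS(gen, X).fit()
print(fit.params)   # intercept, GDP slope, and the policy-shift level change
```

The dummy's coefficient isolates the shift from the smooth income and urbanization trends, which is why it can sit alongside the other regressors listed above.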
Abstract:
The objective of this report is to study the distributed (decentralized) three-phase optimal power flow (OPF) problem in unbalanced power distribution networks. A full three-phase representation of the distribution network is used to account for its highly unbalanced state. All of the distribution network's series/shunt components and load types/combinations were modeled in a commercial version of the General Algebraic Modeling System (GAMS), a high-level modeling system for mathematical programming and optimization. The OPF problem was successfully implemented and solved in both a centralized and a distributed approach, where the objective is to minimize the active power losses in the entire system. The study was carried out on the IEEE-37 Node Test Feeder. A detailed discussion of all aspects of the problem, starting from the basics, is provided in this study, and full simulation results are given at the end of the report.
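As a much-reduced illustration of the loss-minimization objective, the Python sketch below solves a toy two-bus "OPF" with scipy rather than GAMS. All numbers are placeholders, and the report's full three-phase IEEE-37 formulation is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2-bus sketch: one line (per-unit resistance r) from bus 1 to bus 2,
# the load sits at bus 2, and there is a generator at each bus.  All of
# g1 flows over the line, so line losses are approximately r * g1**2
# (a DC-style approximation).  Placeholder numbers throughout.
r, demand = 0.05, 1.0
g_max = np.array([0.8, 0.8])

def line_losses(g):
    return r * g[0] ** 2

# Power balance: generation covers the load plus the line losses.
balance = {"type": "eq",
           "fun": lambda g: g[0] + g[1] - demand - line_losses(g)}
bounds = [(0.0, g_max[0]), (0.0, g_max[1])]

res = minimize(line_losses, x0=[0.5, 0.5], bounds=bounds,
               constraints=[balance], method="SLSQP")
print(res.x, line_losses(res.x))  # optimum shifts generation toward the load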
Abstract:
Much of the research in the field of participatory modeling (PM) has focused on the developed world. Few cases focus on developing regions, and even fewer on Latin American developing countries. The work that has been done in Latin America has often involved water management, often specifically involving water users, and has not focused on the decision-making stage of the policy cycle. Little work has been done to measure the effect PM may have on the perceptions and beliefs of decision makers. In fact, throughout the field of PM, very few attempts have been made to quantitatively measure changes in participant beliefs and perceptions following participation, and of those few, none have attempted to measure the long-term change. This research fills that gap. As part of a participatory modeling project in Sonora, Mexico, a region with water quantity and quality problems, I measured the change in participants' beliefs about water models: their ability to use and understand them, their usefulness, and their accuracy. I also measured changes in beliefs about climate change and about water quantity problems, specifically their causes, solutions, and impacts. I also assessed participant satisfaction with the process and outputs of the participatory modeling workshops. Participants came from water agencies, academic institutions, NGOs, and independent consulting firms. Results indicated that participants' comfort and self-efficacy with water models, their beliefs in the usefulness of water models, and their beliefs about the impact of water quantity problems changed significantly as a result of the workshops. I present my findings and discuss the results.
Abstract:
Telescopic systems of structural members with clearance are found in many applications, e.g., mobile cranes, rack feeders, forklifts, and stacker cranes (see Figure 1). When these machines are operated, undesirable vibrations may reduce performance and increase safety problems. This contribution therefore aims to reduce these harmful vibrations. For a better understanding, the dynamic behaviour of these constructions is analysed. The main interest is the overlapping area of each pair of sections of the systems described above (see markings in Figure 1), which is investigated by measurements and by computations. A test rig is constructed to determine the dynamic behaviour by measuring fundamental vibrations and higher-frequency oscillations, damping coefficients, special phenomena, and more. For an appropriate physical model, the governing boundary value problem is derived by applying Hamilton's principle, and a classical discretisation procedure is used to generate a coupled system of nonlinear ordinary differential equations as the corresponding truncated mathematical model. On the basis of this model, a controller concept for preventing harmful vibrations is developed.
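In outline, the modeling route described above runs from a variational statement to a truncated system of ODEs; the following is a generic LaTeX sketch of that route (the specific kinetic and potential energy expressions for the telescopic sections are in the paper and not reproduced here).

```latex
% Hamilton's principle yields the boundary value problem; a Galerkin/Ritz
% ansatz with N shape functions truncates it to coupled nonlinear ODEs.
\delta \int_{t_0}^{t_1} (T - V)\,\mathrm{d}t = 0, \qquad
w(x,t) \approx \sum_{i=1}^{N} q_i(t)\,\phi_i(x)
\;\Longrightarrow\;
M\ddot{q} + D\dot{q} + Kq = f(q,\dot{q},t)
```

The controller concept is then designed on the truncated model, with the number of retained shape functions chosen to cover both the fundamental vibrations and the higher-frequency oscillations observed on the test rig.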
Abstract:
BACKGROUND Pelvic inflammatory disease (PID) results from the ascending spread of microorganisms, including Chlamydia trachomatis, to the upper genital tract. Screening could improve outcomes by identifying and treating chlamydial infections before they progress to PID (direct effect) or by reducing chlamydia transmission (indirect effect). METHODS We developed a compartmental model that represents a hypothetical heterosexual population and explicitly incorporates progression from chlamydial infection to clinical PID. Chlamydia screening was introduced, with coverage increasing each year for 10 years. We estimated the separate contributions of the direct and indirect effects of screening on PID cases prevented per 100,000 women. We explored the influence of varying the time point at which clinical PID can occur and of increasing the risk of PID after repeated chlamydial infections. RESULTS The probability of PID at baseline was 3.1% by age 25 years. After 5 years, the intervention scenario had prevented 187 PID cases per 100,000 women; after 10 years, 956 PID cases per 100,000 women. At the start of screening, most PID cases were prevented by the direct effect. The indirect effect produced a small net increase in PID cases, which was outweighed by the effect of reduced chlamydia transmission after 2.2 years. The later progression to PID occurs, the greater the contribution of the direct effect. Increasing the risk of PID with repeated chlamydial infection increases the number of PID cases prevented by screening. CONCLUSIONS This study shows the separate roles of direct and indirect PID prevention and potential harms, which cannot be demonstrated in observational studies.
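A stripped-down compartmental sketch of the mechanism described above (susceptible to chlamydia-infected to clinical PID, with screening acting as an extra clearance rate) is given below in Python; the rates, initial values, and the absorbing PID compartment are simplifying assumptions of mine, not the paper's calibrated model.

```python
import numpy as np
from scipy.integrate import odeint

def deriv(y, t, beta, gamma, scr, rho):
    """S -> I (chlamydia) -> P (clinical PID) toy model.
    beta: transmission rate, gamma: natural clearance,
    scr: screening/treatment rate, rho: progression rate to PID.
    Placeholder dynamics, not the paper's calibrated model."""
    S, I, P = y
    n = S + I + P
    new_inf = beta * S * I / n
    dS = -new_inf + (gamma + scr) * I
    dI = new_inf - (gamma + scr + rho) * I
    dP = rho * I                      # PID is absorbing in this toy version
    return [dS, dI, dP]

t = np.linspace(0.0, 10.0, 500)       # years
y0 = [99_000.0, 1_000.0, 0.0]         # per 100,000 women
for scr in (0.0, 0.5):                # no screening vs. screening
    S, I, P = odeint(deriv, y0, t, args=(2.0, 1.0, scr, 0.1)).T
    print(f"screening rate {scr}: cumulative PID at 10 y = {P[-1]:.0f}")
```

The difference in cumulative PID between the two runs is the analogue of the "cases prevented" quantity reported above, although this toy version does not separate the direct and indirect effects as the paper's model does.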
Abstract:
How do probabilistic models represent their targets, and how do they allow us to learn about them? The answer to this question depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.