892 results for Energy consumption -- Computer simulation


Relevance: 100.00%

Abstract:

In this article we apply an environmentally extended input-output model to analyse a specific aspect of the environmental Kuznets curve hypothesis. The purpose of the study is to analyse whether the consumption patterns of households with a better 'economic position' can have a positive effect in reducing environmental pressures. To this end, we combine information from several databases to analyse the atmospheric pollution impact of the consumption of different Spanish households in the year 2000. We consider nine gases, i.e. the six greenhouse gases (CO2, CH4, N2O, SF6, HFCs and PFCs) and three other gases (SO2, NOx and NH3). We classify households into quintiles of per capita expenditure and quintiles of equivalent expenditure. The results show a strong positive relationship between expenditure level and the direct and indirect emissions generated by household consumption; however, emission intensities tend to decrease with expenditure level for the different gases, with the exception of SF6, HFCs and PFCs.
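
The core calculation of an environmentally extended input-output model can be sketched in a few lines; the data below are illustrative placeholders, not the Spanish accounts used in the article.

```python
import numpy as np

# Minimal sketch of an environmentally extended input-output (EEIO) calculation.
# All numbers are illustrative, not the data used in the article.

A = np.array([[0.1, 0.2],      # technical coefficients: inter-industry inputs
              [0.3, 0.1]])     # per unit of sectoral output
f = np.array([0.8, 0.4])       # emission intensities (e.g. kt CO2 per unit output)
y_household = np.array([100.0, 50.0])  # final demand of one household quintile

# Leontief inverse: total (direct + indirect) output needed per unit of demand
L = np.linalg.inv(np.eye(2) - A)

total_output = L @ y_household
emissions = f @ total_output               # emissions embodied in consumption
intensity = emissions / y_household.sum()  # emissions per unit of expenditure

print(f"embodied emissions: {emissions:.1f}, intensity: {intensity:.3f}")
```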

Relevance: 100.00%

Abstract:

Gel electrophoresis allows one to separate knotted DNA (nicked circular) of equal length according to the knot type. At low electric fields, complex knots, being more compact, drift faster than simpler knots. Recent experiments have shown that the drift velocity dependence on the knot type is inverted when changing from low to high electric fields. We present a computer simulation on a lattice of a closed, knotted, charged DNA chain drifting in an external electric field in a topologically restricted medium. Using a Monte Carlo algorithm, the dependence of the electrophoretic migration of the DNA molecules on the knot type and on the electric field intensity is investigated. The results are in qualitative and quantitative agreement with electrophoretic experiments done under conditions of low and high electric fields.
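
The elementary move behind such a lattice Monte Carlo model can be sketched as a field-biased Metropolis step; the parameters are illustrative, and the paper's full model additionally enforces chain connectivity, excluded volume and a fixed knot topology.

```python
import random, math

# Minimal sketch of the Metropolis step in a lattice Monte Carlo
# electrophoresis model: a charged monomer's trial move is accepted with
# probability exp(-dE/kT), where the external field contributes -qE*dx.

q_E = 0.1          # reduced field strength q*E/kT (assumed units)
steps = 100_000

x = 0
for _ in range(steps):
    dx = random.choice((-1, 1))           # trial move along the field axis
    dE = -q_E * dx                        # energy change in the field
    if dE <= 0 or random.random() < math.exp(-dE):
        x += dx                           # Metropolis acceptance

print("mean drift per step:", x / steps)  # grows with the field strength
```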

Relevance: 100.00%

Abstract:

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is required for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the crystal structure itself. This validation illustrates the efficiency of our sampling strategy: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
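
The overall shape of such a hybrid evolutionary search can be sketched as below. The two scoring functions are toy stand-ins, not CHARMM energies, and the alternation schedule and population size are assumptions for illustration only.

```python
import random

# Minimal sketch of an evolutionary docking loop in the spirit described
# above: a population of candidate poses evolves under mutation, and two
# fitness functions (a cheap one for broad sampling, an accurate one for
# final ranking) are alternated while some diversity is preserved.

def fast_score(pose):      # placeholder for a cheap scoring function
    return sum((x - 1.0) ** 2 for x in pose)

def accurate_score(pose):  # placeholder for an expensive energy evaluation
    return fast_score(pose) + 0.01 * random.random()

def mutate(pose, step=0.3):
    return tuple(x + random.gauss(0.0, step) for x in pose)

population = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(20)]

for generation in range(200):
    score = accurate_score if generation % 10 == 0 else fast_score
    children = [mutate(random.choice(population)) for _ in range(20)]
    # keep the best poses; deduplication keeps some diversity in the pool
    population = sorted(set(population + children), key=score)[:20]

print("best pose:", min(population, key=accurate_score))
```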

Relevance: 100.00%

Abstract:

In its 2007 Session, the Iowa General Assembly passed, and Governor Culver signed into law, extensive and far-reaching state energy policy legislation. This legislation created the Iowa Office of Energy Independence and the Iowa Power Fund. It also required a report to be issued each year detailing:

• The historical use and distribution of energy in Iowa.
• The growth rate of energy consumption in Iowa, including rates of growth for each energy source.
• A projection of Iowa's energy needs through at least the year 2025.
• The impact of meeting Iowa's energy needs on the economy of the state, including the impact of energy production and use on greenhouse gas emissions.
• An evaluation of renewable energy sources, including the current and future technological potential for such sources.

Much of the energy information for this report has been derived from the online resources of the Energy Information Administration (EIA) of the United States Department of Energy (USDOE). The EIA provides policy-independent data, forecasts and analyses on energy production, stored supplies, consumption and prices. For complete, economy-wide information, the most recent data available are for the year 2008. For some energy sectors, more current data are available from the EIA and other sources; when available, such information has been included in this report.

Relevance: 100.00%

Abstract:

In mammography, the image contrast and the dose delivered to the patient are determined by the x-ray spectrum and the scatter-to-primary ratio S/P. The quality of the mammographic procedure is therefore highly dependent on the choice of anode and filter material and on the method used to reduce the amount of scattered radiation reaching the detector. Synchrotron radiation is a useful tool for studying the effect of beam energy on the optimization of the mammographic process because it delivers a high flux of monochromatic photons. Moreover, because the beam is naturally flat-collimated in one direction, a slot can be used instead of a grid for scatter reduction. We have measured the ratio S/P and the transmission factors of grids and slots for monoenergetic synchrotron radiation. In this way the effects of beam energy and of the scatter rejection method were separated, and their respective importance for image quality and dose analyzed. Our results show that conventional mammographic spectra are not far from optimum and that the use of a slot instead of a grid has an important effect on the optimization of the mammographic process. We propose a simple numerical model to quantify this effect.
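
The textbook relation behind the role of S/P (not necessarily the paper's numerical model) is that scatter adds a roughly uniform background, reducing detail contrast by the factor 1/(1 + S/P); a sketch with illustrative numbers:

```python
# Minimal sketch of how the scatter-to-primary ratio S/P degrades image
# contrast. `transmission` models a grid or slot that lets through only a
# fraction of the scatter (1.0 = no rejection). All values are illustrative.

def degraded_contrast(c0: float, s_over_p: float, transmission: float = 1.0) -> float:
    effective_sp = s_over_p * transmission
    return c0 / (1.0 + effective_sp)

c0 = 0.10                    # primary (scatter-free) contrast of a detail
for sp in (0.2, 0.5, 1.0):   # S/P rises with breast thickness
    print(f"S/P={sp:.1f}: no rejection {degraded_contrast(c0, sp):.3f}, "
          f"slot removing 90% of scatter {degraded_contrast(c0, sp, 0.1):.3f}")
```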

Relevance: 100.00%

Abstract:

Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA-binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. For instance, CTF/NFI uses a flexible DNA binding mode that allows for variations of the binding site length. From the experimental data, we derived a novel prediction method using a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor to natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
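
For orientation, the simpler weight-matrix scoring that the article argues is insufficient for CTF/NFI can be sketched as follows; generalised profiles extend this scheme with gaps, allowing variable binding-site lengths. The matrix below is a toy example, not the CTF/NFI profile.

```python
# Minimal sketch of scoring DNA sequences with a position weight matrix.

pwm = {               # log-odds scores per position for each base (assumed)
    'A': [ 0.5, -1.0,  0.1, -2.0],
    'C': [-1.0,  0.8, -0.5,  1.2],
    'G': [ 0.2, -0.3,  1.0, -1.0],
    'T': [-0.5,  0.1, -1.2,  0.3],
}
width = 4

def score_window(seq: str) -> float:
    return sum(pwm[base][i] for i, base in enumerate(seq))

def best_site(sequence: str):
    """Slide the matrix along the sequence, return (score, position) of the best site."""
    hits = [(score_window(sequence[i:i + width]), i)
            for i in range(len(sequence) - width + 1)]
    return max(hits)

print(best_site("TTACGGACCT"))
```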

Relevance: 100.00%

Abstract:

Pharmacokinetic variability in drug levels is, for some drugs, a major determinant of treatment success, since sub- or supra-therapeutic concentrations may lead to inefficacy, toxic reactions or treatment discontinuation. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target patient population and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism of CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. Quantifying and identifying the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
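
The population idea can be sketched with a one-compartment oral model and log-normal between-patient variability on clearance; all parameter values below are illustrative assumptions, not the NVP estimates from the study.

```python
import math, random

# Minimal sketch of a population-pharmacokinetic simulation: trough
# concentrations from a one-compartment model with first-order absorption,
# clearance varying log-normally across patients.

ka, V = 1.0, 80.0          # absorption rate (1/h) and volume (L), assumed
CL_pop = 3.0               # population clearance (L/h), assumed
omega = 0.3                # between-patient SD on log(CL), assumed
dose = 200.0               # mg

def concentration(t, CL):
    ke = CL / V
    return dose * ka / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# simulate trough concentrations (12 h post-dose) across a population
troughs = sorted(
    concentration(12.0, CL_pop * math.exp(random.gauss(0.0, omega)))
    for _ in range(1000)
)

print("median trough:", round(troughs[500], 2),
      "90% interval:", round(troughs[50], 2), "-", round(troughs[950], 2))
```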

Relevance: 100.00%

Abstract:

In this paper, we present a computer simulation study of the ion binding process at an ionizable surface, using a semi-grand canonical Monte Carlo method that models the surface as a discrete distribution of charged and neutral functional groups in equilibrium with explicit ions modelled within the primitive model. The parameters of the simulation model were tuned and checked by comparison with experimental titrations of carboxylated latex particles in the presence of different ionic strengths of monovalent ions. The titration of these particles was analysed by calculating the degree of dissociation of the latex functional groups versus pH at different background salt concentrations. As the charge of the titrated surface changes during the simulation, a procedure to maintain the electroneutrality of the system is required; two approaches are used here, depending on the ion selected to maintain electroneutrality: the counterion and the coion procedures. We compare and discuss the differences between these procedures. The simulations also provide a microscopic description of the electrostatic double layer (EDL) structure as a function of pH and ionic strength. The results allow us to quantify the effect of the size of the background salt ions and of the surface functional groups on the degree of dissociation. The non-homogeneous structure of the EDL is revealed by plotting the counterion density profiles around charged and neutral surface functional groups.
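
The titration move at the heart of a semi-grand canonical scheme can be sketched as below. Electrostatics is reduced to a toy mean-field penalty; the article's model instead uses explicit primitive-model ions and the electroneutrality procedures discussed above. The pKa, pH and coupling values are assumptions.

```python
import math, random

# Minimal sketch of a semi-grand canonical titration move: a surface group
# toggles between charged (dissociated) and neutral, with an acceptance rule
# driven by (pH - pKa) plus the electrostatic energy change, in kT units.

pKa, pH = 4.9, 6.0       # carboxyl-like group (assumed values)
n_groups = 100
charged = [False] * n_groups
coupling = 0.05          # toy repulsion between charged groups (kT)

def delta_elec(i, become_charged):
    n_other = sum(charged) - (1 if charged[i] else 0)
    sign = 1.0 if become_charged else -1.0
    return sign * coupling * n_other

for _ in range(200_000):
    i = random.randrange(n_groups)
    become_charged = not charged[i]
    sign = 1.0 if become_charged else -1.0
    # free-energy change: chemical (pH, pKa) term + electrostatic term
    dG = -sign * math.log(10.0) * (pH - pKa) + delta_elec(i, become_charged)
    if dG <= 0 or random.random() < math.exp(-dG):
        charged[i] = become_charged

print("degree of dissociation:", sum(charged) / n_groups)
```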

Relevance: 100.00%

Abstract:

The motivation for this research stemmed from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry?

Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. As a slow-clockspeed industry, however, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.

Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second, by researching process re-engineering in the case of complex-system global software support; and third, by investigating the views of the industry actors - customers, incumbents and newcomers - on the future direction of industrial automation. We conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions can be studied rigorously and in great detail using a considerable variety of quantum dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of the time-dependent (Real Wave Packet) and time-independent (ABC) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations have the advantage of covering a whole range of energies in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort required to account for the Coriolis couplings, are analyzed in this paper.
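
The time-dependent strategy's advantage can be illustrated in one dimension: a single split-operator wave-packet propagation over a barrier carries a whole range of collision energies at once. This is a toy sketch, not the triatomic Real Wave Packet method; units and parameters are illustrative.

```python
import numpy as np

# Minimal sketch of split-operator propagation of a 1D Gaussian wave packet
# over a Gaussian barrier, using FFTs between position and momentum space.

n, L_box = 1024, 200.0
x = np.linspace(-L_box / 2, L_box / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
m, dt = 1.0, 0.05

V = 0.1 * np.exp(-x**2)                  # Gaussian barrier (toy potential)
k0, sigma, x0 = 0.6, 5.0, -50.0          # initial momentum, width, centre
psi = np.exp(-(x - x0)**2 / (2 * sigma**2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

expV = np.exp(-0.5j * V * dt)            # half-step potential propagator
expK = np.exp(-0.5j * k**2 * dt / m)     # full kinetic step in k-space

for _ in range(4000):
    psi = expV * psi
    psi = np.fft.ifft(expK * np.fft.fft(psi))
    psi = expV * psi

transmission = np.sum(np.abs(psi[x > 10.0])**2) * dx
print(f"transmitted probability: {transmission:.3f}")
```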

Relevance: 100.00%

Abstract:

The aim of this computerized simulation model is to provide an estimate of the number of beds used by a population, taking into account the important determining factors. These factors are the demographic data of the served population, hospitalization rates, hospital case mix and length of stay; these parameters can be taken either from observed data or from scenarios. As an example, the projected evolution of the number of beds in Canton Vaud for the period 1993-2010 is presented.
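
The arithmetic such a model rests on can be sketched as below; the figures are illustrative, not Canton Vaud data, and the actual model stratifies by age group and case mix.

```python
# Minimal sketch of the bed-projection calculation:
# beds needed = (population x hospitalization rate x mean length of stay)
#               / (365 x target occupancy)

def beds_needed(population, admissions_per_1000, mean_los_days,
                occupancy=0.85):
    admissions = population * admissions_per_1000 / 1000.0
    patient_days = admissions * mean_los_days
    return patient_days / (365.0 * occupancy)

# two scenarios: baseline vs. a growing population with shorter stays
print(round(beds_needed(650_000, 160, 9.0)))   # baseline scenario
print(round(beds_needed(700_000, 170, 7.5)))   # shorter-stay scenario
```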

Relevance: 100.00%

Abstract:

When decommissioning a nuclear facility, it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an accuracy of somewhat more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is supposed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot: the research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be directly measured because of a lack of available material or specific geometries.
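
The relative use of the simulation described above amounts to scaling one measured calibration factor by a ratio of simulated efficiencies; the numbers below are illustrative placeholders, not values from the study.

```python
# Minimal sketch of the relative Monte Carlo calibration method: anchor the
# simulation to one measured reference, then transfer the calibration to a
# geometry/nuclide combination that cannot be measured directly.

measured_ref_cf = 12.4   # cps per Bq/g, reference nuclide, measured geometry
sim_ref_eff = 0.031      # simulated detection efficiency, same setup
sim_target_eff = 0.019   # simulated efficiency: other nuclide, gravel dumpster

# relative method: scale the measured factor by the simulated ratio
target_cf = measured_ref_cf * sim_target_eff / sim_ref_eff

count_rate = 5.7         # cps measured on the unknown sample
activity = count_rate / target_cf
print(f"estimated specific activity: {activity:.2f} Bq/g")
```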

Relevance: 100.00%

Abstract:

Using the carbon footprint as a tool, this study seeks to determine the environmental impact of the everyday life of high-income groups in selected neighbourhoods of the Concepción Metropolitan Area. Specifically, the carbon footprint of mobility and of household energy consumption is studied. Data collection is based on house-to-house surveys, followed by computer analysis of the data obtained. Finally, measures to reduce the carbon footprint will be proposed where it exceeds the values considered sustainable in the long term.
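
The bookkeeping behind such a survey-based footprint study can be sketched as activity levels multiplied by emission factors; both the factors and the survey answers below are illustrative placeholders.

```python
# Minimal sketch of household carbon-footprint accounting from survey data.

EMISSION_FACTORS = {          # kg CO2e per unit (assumed values)
    "car_km": 0.21,
    "bus_km": 0.08,
    "electricity_kwh": 0.40,
    "gas_kwh": 0.20,
}

def household_footprint(survey: dict) -> float:
    """Annual footprint in kg CO2e from survey-reported activity levels."""
    return sum(EMISSION_FACTORS[item] * amount
               for item, amount in survey.items())

answers = {"car_km": 12_000, "bus_km": 800,
           "electricity_kwh": 3_500, "gas_kwh": 6_000}
total = household_footprint(answers)
print(f"{total / 1000:.1f} t CO2e per year")  # compare with a sustainability target
```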

Relevance: 100.00%

Abstract:

For the detection and management of osteoporosis and osteoporosis-related fractures, quantitative ultrasound (QUS) is emerging as a relatively low-cost and readily accessible alternative to dual-energy X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in certain circumstances. The following is a brief but thorough review of the existing literature on the use of QUS in six settings: 1) assessing fragility fracture risk; 2) diagnosing osteoporosis; 3) initiating osteoporosis treatment; 4) monitoring osteoporosis treatment; 5) osteoporosis case finding; and 6) quality assurance and control. Many QUS devices exist that differ considerably in the parameters they measure and in the strength of the empirical evidence supporting their use. In general, heel QUS appears to be the most tested and most effective. Overall, some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations, the evidence being strongest for Caucasian females over 55 years old. Otherwise, the evidence is fair with respect to certain devices allowing for accurate assessment of the likelihood of osteoporosis, and generally fair to poor regarding the use of QUS for initiating or monitoring osteoporosis treatment. A reasonable protocol for case-finding purposes is proposed herein, relying on a combined assessment of clinical risk factors (CRF) and heel QUS. Finally, several recommendations are made for quality assurance and control.