817 results for Best-case scenario


Relevance: 100.00%

Abstract:

There is a huge knowledge gap in our understanding of many terrestrial carbon cycle processes. In this paper, we investigate the bounds on terrestrial carbon uptake over India that arise solely from CO2-fertilization. For this purpose, we use a terrestrial carbon cycle model and consider two extreme scenarios: in one simulation, unlimited CO2-fertilization is allowed for the terrestrial vegetation at a CO2 concentration of 735 ppm; in the other, CO2-fertilization is capped at its 1975 level. Our simulations show that, under equilibrium conditions, modeled carbon stocks in natural potential vegetation increase by 17 Gt-C with unlimited fertilization for CO2 levels and climate change corresponding to the end of the 21st century, but decline by 5.5 Gt-C if fertilization is capped at the 1975 CO2 concentration. The carbon stock changes are dominated by forests. The area covered by natural potential forests increases by about 36% in the unlimited-fertilization case but decreases by 15% in the fertilization-capped case. Thus, the assumption regarding CO2-fertilization has the potential to alter the sign of terrestrial carbon uptake over India. Our model simulations also imply that the maximum potential terrestrial sequestration over India, under equilibrium conditions and the best-case scenario of unlimited CO2-fertilization, is only 18% of the 21st-century SRES A2 scenario emissions from India. The limited uptake potential of the natural potential vegetation suggests that reduction of CO2 emissions and afforestation programs should be top priorities.
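
As a rough illustration of the bounding exercise described above, the Python sketch below converts the two equilibrium stock changes into fractions of cumulative emissions; only the 17 Gt-C and -5.5 Gt-C figures come from the abstract, while the cumulative SRES A2 emissions value for India is a hypothetical placeholder.

```python
# Hedged sketch: only the two stock-change bounds are from the paper; the
# cumulative 21st-century SRES A2 emissions figure for India is a placeholder.
def uptake_fraction(stock_change_gtc: float, cumulative_emissions_gtc: float) -> float:
    """Fraction of cumulative emissions offset by an equilibrium carbon stock change."""
    return stock_change_gtc / cumulative_emissions_gtc

UNLIMITED_FERTILIZATION_GTC = 17.0   # upper bound (unlimited CO2-fertilization)
CAPPED_FERTILIZATION_GTC = -5.5      # lower bound (fertilization capped at 1975 levels)
A2_EMISSIONS_INDIA_GTC = 94.0        # hypothetical cumulative emissions, Gt-C

for label, change in [("unlimited fertilization", UNLIMITED_FERTILIZATION_GTC),
                      ("capped fertilization", CAPPED_FERTILIZATION_GTC)]:
    print(f"{label}: {change:+.1f} Gt-C "
          f"({uptake_fraction(change, A2_EMISSIONS_INDIA_GTC):+.0%} of assumed emissions)")
```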

Relevance: 100.00%

Abstract:

This study has established that a computer model, the Anaerobic Digestion Model 1 (ADM1), is suitable for investigating the stability and energy balance of the anaerobic digestion of food waste. In simulations, digestion of undiluted food waste was less stable than that of sewage sludge or of mixtures of the two, but gave much higher average methane yields per unit of digester volume. In the best-case scenario simulations, food waste produced 5.3 Nm3 of methane per day per m3 of digester volume, much higher than sewage sludge alone at 1.1 Nm3 per day per m3. There was no substantial difference in yield per unit of volatile solids added. Food waste, however, did not sustain stable digestion if its cation content was below a certain level. Mixing food waste with sewage sludge allowed digestion at a lower cation content. Changes in the composition of the food waste feedstock caused great variation in biogas output and, even more so, in volatile fatty acid concentration, which lowered digestion stability. Modelling anaerobic digestion allowed simulation of failure scenarios and gave insights into the importance of the cation/anion balance and the magnitude of variability in feedstocks.
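
As a sketch of why the per-volume and per-volatile-solids comparisons above can diverge (this is not a reproduction of the ADM1 simulations), the snippet below expresses volumetric methane production as specific yield times organic loading rate; the specific yield and loading rates are assumed values chosen only to illustrate the factor-of-five difference.

```python
# Illustrative only: the specific yield and loading rates below are assumed,
# not taken from the ADM1 simulations described in the abstract.
def volumetric_methane(specific_yield_nm3_per_kg_vs: float,
                       loading_kg_vs_per_m3_day: float) -> float:
    """Methane production in Nm3 per m3 of digester per day."""
    return specific_yield_nm3_per_kg_vs * loading_kg_vs_per_m3_day

# Similar yields per kg of volatile solids, but undiluted food waste can be fed
# much more concentrated than sewage sludge, so the volumetric output differs widely.
print(volumetric_methane(0.35, 15.0))  # food waste (assumed loading): ~5.3 Nm3/m3/day
print(volumetric_methane(0.35, 3.0))   # sewage sludge (assumed loading): ~1.1 Nm3/m3/day
```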

Relevance: 100.00%

Abstract:

Paper published in PLoS Medicine in 2007.

Relevance: 100.00%

Abstract:

Side-channel analysis of cryptographic systems can allow an adversary to recover secret information even where the underlying algorithms have been shown to be provably secure. This is achieved by exploiting unintentional leakages inherent in the implementation of the algorithm in software or hardware. Within this field of research, a class of attacks known as profiling attacks, or, more specifically as used here, template attacks, have been shown to be extremely efficient at extracting secret keys. Template attacks assume a strong adversarial model, in that the attacker has an identical device with which to profile the power consumption of various operations; this profile can then be used to attack the target device efficiently. Inherent in this assumption is that power consumption is reasonably similar across the devices under test. This central tenet of the attack is largely unexplored in the literature, with the research community generally performing the profiling stage on the same device that is attacked. This is useful for evaluation or penetration testing, as it is essentially the best-case scenario for an attacker in which the model built during the profiling stage matches that of the target device exactly; however, it does not necessarily reflect how the attack will perform in reality. In this work, a large-scale evaluation of this assumption is performed, comparing key recovery performance across 20 identical smart cards when performing a profiling attack.
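
To make the profiling/attack split concrete, here is a minimal, hedged sketch of a template attack using a single point of interest and univariate Gaussian templates; practical template attacks use multivariate templates over several points of interest, and the data below are synthetic rather than taken from the smart cards studied in this work.

```python
import numpy as np

def build_templates(profiling_traces: np.ndarray, labels: np.ndarray) -> dict:
    """Profiling phase: estimate a Gaussian (mean, variance) per key-dependent class."""
    templates = {}
    for cls in np.unique(labels):
        samples = profiling_traces[labels == cls]
        templates[cls] = (samples.mean(), samples.var() + 1e-12)
    return templates

def classify(sample: float, templates: dict):
    """Attack phase: pick the class whose Gaussian gives the sample the highest likelihood."""
    def log_likelihood(mu, var):
        return -0.5 * (np.log(2 * np.pi * var) + (sample - mu) ** 2 / var)
    return max(templates, key=lambda c: log_likelihood(*templates[c]))

# Synthetic leakage: class 0 leaks around 0.0, class 1 around 1.0.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 1000)
traces = labels + rng.normal(0.0, 0.3, 1000)
print(classify(0.9, build_templates(traces, labels)))  # expected: 1
```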

Relevance: 100.00%

Abstract:

Genes are the parts of the genome that code for proteins. The genes of one or more species can be grouped into "families" according to their sequence similarity. However, sequence similarity alone is not enough to determine the functional relationships between these gene copies. For that, it is important to study the evolution of a family through duplications and losses in order to distinguish between orthologous genes, copies that evolved by speciation and are likely to have retained a common function, and paralogous genes, copies that evolved by duplication and have probably developed new functions. Given a gene family present in n different species, a gene tree (obtained by a classical phylogenetic method), and a phylogenetic tree for the n species, "reconciliation" is the most common approach for inferring an evolutionary history of that family through duplications, speciations, and losses. The degree of confidence placed in the inferred history is directly related to the degree of confidence placed in the gene tree itself. It is therefore important to have a preliminary method for correcting gene trees. This work introduces a methodology for "correcting" a gene tree: removing the minimum number of "misplaced" leaves so as to obtain a tree in which all duplication vertices (inferred by reconciliation) are "apparent duplication" vertices, thereby yielding a gene tree in "agreement" with the species phylogeny. I introduce an exact algorithm for trees of a certain class, and a heuristic for the general case.
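
As context for the reconciliation step this work builds on, here is a hedged Python sketch of standard LCA (lowest common ancestor) reconciliation, which labels gene-tree nodes as duplications or speciations; the leaf-removal correction algorithm introduced in the thesis itself is not reproduced here.

```python
class Node:
    """Minimal rooted tree node; leaves carry a name."""
    def __init__(self, name=None, children=()):
        self.name, self.children = name, list(children)

def clade(node):
    """Set of leaf names below a node."""
    if not node.children:
        return {node.name}
    return set().union(*(clade(c) for c in node.children))

def species_lca(species_root, species_set):
    """Smallest species-tree node whose clade contains every species in species_set."""
    node = species_root
    while True:
        for child in node.children:
            if species_set <= clade(child):
                node = child
                break
        else:
            return node

def label_nodes(gene_node, species_root, species_of_gene, labels):
    """Map gene-tree nodes to species-tree nodes; record duplication/speciation labels."""
    if not gene_node.children:
        return species_lca(species_root, {species_of_gene[gene_node.name]})
    child_maps = [label_nodes(c, species_root, species_of_gene, labels)
                  for c in gene_node.children]
    own_map = species_lca(species_root, {species_of_gene[g] for g in clade(gene_node)})
    labels[id(gene_node)] = "duplication" if own_map in child_maps else "speciation"
    return own_map

# Toy example: species tree ((A,B),C); two gene copies from A and one from B.
sp = Node(children=[Node(children=[Node("A"), Node("B")]), Node("C")])
gt = Node(children=[Node(children=[Node("a1"), Node("a2")]), Node("b1")])
labels = {}
label_nodes(gt, sp, {"a1": "A", "a2": "A", "b1": "B"}, labels)
print(list(labels.values()))  # ['duplication', 'speciation']
```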

Relevance: 100.00%

Abstract:

This work explores the development of the mobile telephony sector in Colombia from its beginnings to the present day, in order to generate future scenarios. The prospective tools MicMac (Prospective Structural Analysis) and Smic (Cross-Impact Matrix System), together with the opinions of leading experts in the sector, are the main basis for the work. Government entities, the CRT (Comisión de Regulación de Telecomunicaciones), and the leaders of the mobile telephony operators, among others, have become aware that innovation is the basis of success in this type of organization, and regulation has therefore been improved so that the products and services offered keep getting better while causing less harm to the environment and to users. This subsector of telecommunications is the most dynamic and has the greatest potential. However, it is also affected by the economic conditions of the market, political instability, and the imports and exports arising from trade agreements, among other issues. The target scenario would facilitate the provision of state-of-the-art products and of services with the best possible coverage and access at low prices.

Relevance: 100.00%

Abstract:

This dissertation discusses structural-electrostatic modeling techniques, genetic algorithm based optimization and control design for electrostatic micro devices. First, an alternative modeling technique, the interpolated force model, for electrostatic micro devices is discussed. The method provides improved computational efficiency relative to a benchmark model, as well as improved accuracy for irregular electrode configurations relative to a common approximate model, the parallel plate approximation model. For the configuration most similar to two parallel plates, expected to be the best case scenario for the approximate model, both the parallel plate approximation model and the interpolated force model maintained less than 2.2% error in static deflection compared to the benchmark model. For the configuration expected to be the worst case scenario for the parallel plate approximation model, the interpolated force model maintained less than 2.9% error in static deflection while the parallel plate approximation model is incapable of handling the configuration. Second, genetic algorithm based optimization is shown to improve the design of an electrostatic micro sensor. The design space is enlarged from published design spaces to include the configuration of both sensing and actuation electrodes, material distribution, actuation voltage and other geometric dimensions. For a small population, the design was improved by approximately a factor of 6 over 15 generations to a fitness value of 3.2 fF. For a larger population seeded with the best configurations of the previous optimization, the design was improved by another 7% in 5 generations to a fitness value of 3.0 fF. Third, a learning control algorithm is presented that reduces the closing time of a radiofrequency microelectromechanical systems switch by minimizing bounce while maintaining robustness to fabrication variability. Electrostatic actuation of the plate causes pull-in with high impact velocities, which are difficult to control due to parameter variations from part to part. A single degree-of-freedom model was utilized to design a learning control algorithm that shapes the actuation voltage based on the open/closed state of the switch. Experiments on 3 test switches show that after 5-10 iterations, the learning algorithm lands the switch with an impact velocity not exceeding 0.2 m/s, eliminating bounce.
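
For reference, the parallel plate approximation mentioned above has the closed form F = ε0·A·V² / (2·d²) for an ideal air-gap capacitor; the short sketch below evaluates it for illustrative dimensions. The interpolated force model developed in the dissertation is a separate technique and is not reproduced here.

```python
# Parallel plate approximation only; the geometry and voltage below are
# illustrative values, not taken from the dissertation.
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_force(voltage_v: float, area_m2: float, gap_m: float) -> float:
    """Attractive electrostatic force between two parallel plates, in newtons."""
    return EPSILON_0 * area_m2 * voltage_v ** 2 / (2.0 * gap_m ** 2)

# Example: 200 um x 200 um plate, 2 um gap, 10 V bias.
print(parallel_plate_force(10.0, 200e-6 * 200e-6, 2e-6))  # ~4.4e-6 N
```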

Relevance: 100.00%

Abstract:

This thesis is composed of three life-cycle analysis (LCA) studies of manufacturing to determine cumulative energy demand (CED) and greenhouse gas (GHG) emissions. The methods proposed could reduce the environmental impact by reducing the CED of three manufacturing processes. First, industrial symbiosis is proposed, and an LCA is performed on conventional 1 GW-scale hydrogenated amorphous silicon (a-Si:H)-based single-junction and a-Si:H/microcrystalline-Si:H tandem-cell solar PV manufacturing plants, both stand-alone and coupled to silane recycling plants. Using a recycling process in which only 17 percent of the silane is lost, versus 85 percent without recycling, yields CED savings of 81,700 GJ and 290,000 GJ per year for single- and tandem-junction plants, respectively. This recycling process reduces the cost of raw silane by 68 percent, or approximately $22.6 million and $79 million per year for single and tandem 1 GW PV production facilities, respectively. The results show the environmental benefits of silane recycling centered on a-Si:H-based PV manufacturing plants. Second, an open-source self-replicating rapid prototyper, or 3-D printer, the RepRap, has the potential to reduce the environmental impact of manufacturing polymer-based products through a distributed manufacturing paradigm, an impact that is further reduced by the use of PV and by improvements in PV manufacturing. Using 3-D printers for manufacturing provides the ability to ultra-customize products and to change fill composition, which increases material efficiency. An LCA was performed on three polymer-based products to determine the CED and GHG emissions of conventional large-scale production, which were compared with experimental measurements on a RepRap producing identical products in ABS and PLA. The results of this LCA study indicate that the CED of manufacturing polymer products can potentially be reduced using distributed manufacturing with existing 3-D printers at less than 89% fill, and reduced even further with a solar photovoltaic system. The results indicate that the ability of RepRaps to vary fill has the potential to diminish the environmental impact of many products. Third, one additional way to improve the environmental performance of this distributed manufacturing system is to create the polymer filament feedstock for 3-D printers from post-consumer plastic bottles. An LCA was performed on the recycling of high-density polyethylene (HDPE) using the RecycleBot. The results showed that distributed recycling has a lower CED than even the best-case scenario for centralized recycling. If this process were applied to the HDPE currently recycled in the U.S., more than 100 million MJ of energy could be conserved per annum, along with significant reductions in GHG emissions. This presents a novel path to a future of distributed manufacturing suited to both the developed and developing world, with reduced environmental impact. From improving manufacturing in the photovoltaic industry with the use of recycling, to recycling and manufacturing plastic products within our own homes, each step reduces the impact on the environment. The three coupled projects presented here show a clear potential to reduce the environmental impact of manufacturing and other processes by implementing complementary systems, each with environmental benefits of its own, to achieve a compounding reduction in CED and GHG emissions.
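
To show how the 68 percent figure follows from the two loss fractions, here is a back-of-envelope sketch; the 17% and 85% losses are from the abstract, the 15% utilization simply mirrors the 85% baseline loss, and the annual feed and embodied energy values are hypothetical placeholders.

```python
# Back-of-envelope sketch of the silane recycling benefit; feed mass and
# embodied energy are placeholders, not values from the thesis.
def virgin_silane_needed(feed_kg: float, loss_fraction: float, utilization: float = 0.15) -> float:
    """Silane that must be purchased: material deposited in the films plus unrecovered losses."""
    return feed_kg * (utilization + loss_fraction)

FEED_KG = 1.0e6                    # hypothetical annual silane feed of a 1 GW plant
EMBODIED_ENERGY_MJ_PER_KG = 1.0e3  # hypothetical cradle-to-gate CED of silane

baseline = virgin_silane_needed(FEED_KG, 0.85)  # no recycling: everything fed is purchased
recycled = virgin_silane_needed(FEED_KG, 0.17)  # with the silane recycling plant
print(f"silane purchases cut by {1 - recycled / baseline:.0%}")  # ~68%, as reported
print(f"illustrative CED saving: {(baseline - recycled) * EMBODIED_ENERGY_MJ_PER_KG:.2e} MJ/yr")
```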

Relevance: 100.00%

Abstract:

This cross-sectional study was undertaken to evaluate the impact of the City of Houston HIV/STD prevention program in HISD high schools on students' HIV/STD knowledge and sexual behavior, by comparing participating students with peers who had not participated, based on self-reports. The study further evaluated the program's cost-effectiveness in averting future HIV infections by computing cost-utility ratios based on reported sexual behavior. Mixed results were obtained, indicating a statistically significant difference in knowledge, with the intervention group scoring higher (p-value 0.001), but no significant difference for any of the behaviors assessed. The overall p-value for the knowledge score outcome remained statistically significant after adjusting for each stratifying variable (age, grade, gender, and ethnicity). The odds ratio of scoring 70% or higher, relative to comparison group counterparts, was 1.86 for intervention group participants aged 15 years or more, 2.29 for female intervention group participants, and 2.47 for Black/African American intervention group participants. The knowledge score results remained statistically significant in a logistic regression model controlling for age, grade level, gender, and ethnicity, with an odds ratio of 1.74. Three scenarios based on the difference in the risk of HIV infection between the intervention and comparison groups were used to compute cost-utility ratios: base, worst-case, and best-case. The best-case scenario yielded cost-effective results for male participants and cost-saving results for female participants when using ethnicity-adjusted HIV prevalence; it remained cost-effective for female participants when using unadjusted HIV prevalence. The challenge for the program is to devise approaches that can enhance benefits for male participants. If this is a threshold problem, implying that male participants require more intensive programs for behavioral change, then programs should first be piloted among boys before being implemented across the board. If it reflects gender differences, then we may have to go back to the drawing board and engage boys in focus group discussions to help formulate more effective programs. The gender-blind approaches currently in vogue do not seem to be working.
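
As a hedged illustration of the cost-utility calculation referred to above (not the study's actual parameter values, which are not given in the abstract), the sketch below computes a cost-utility ratio from program cost, infections averted, and quality-adjusted life years (QALYs) saved.

```python
# All numeric inputs below are placeholders for illustration only.
def cost_utility_ratio(program_cost: float,
                       infections_averted: float,
                       qalys_lost_per_infection: float,
                       treatment_cost_per_infection: float) -> float:
    """Net cost per QALY saved; negative values indicate a cost-saving program."""
    net_cost = program_cost - infections_averted * treatment_cost_per_infection
    qalys_saved = infections_averted * qalys_lost_per_infection
    return net_cost / qalys_saved

print(cost_utility_ratio(program_cost=250_000.0,
                         infections_averted=2.5,
                         qalys_lost_per_infection=7.0,
                         treatment_cost_per_infection=200_000.0))  # negative: cost-saving
```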

Relevance: 100.00%

Abstract:

Maximizing data quality may be especially difficult in trauma-related clinical research. Strategies are needed to improve data quality and to assess the impact of data quality on clinical predictive models. This study had two objectives. The first was to compare missing data between two multi-center trauma transfusion studies: a retrospective study (RS) using medical chart data with minimal data quality review, and the PRospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study with standardized quality assurance. The second was to assess the impact of missing data on clinical prediction algorithms by evaluating blood transfusion prediction models using PROMMTT data. RS (2005-06) and PROMMTT (2009-10) investigated trauma patients receiving ≥ 1 unit of red blood cells (RBC) at ten Level I trauma centers. Missing data were compared for 33 variables collected in both studies using mixed-effects logistic regression (including random intercepts for study site). Massive transfusion (MT) patients received ≥ 10 RBC units within 24 h of admission. Correct classification percentages for three MT prediction models were evaluated using complete case analysis and multiple imputation based on the multivariate normal distribution. A sensitivity analysis for missing data estimated the upper and lower bounds of correct classification under best- and worst-case assumptions about the missing data. Most variables (17/33 = 52%) had < 1% missing data in both RS and PROMMTT. Of the remaining variables, 50% had less missingness in PROMMTT, 25% had less missingness in RS, and 25% were similar between studies. Missing percentages for MT prediction variables in PROMMTT ranged from 2.2% (heart rate) to 45% (respiratory rate). For variables with > 1% missing, study site was associated with missingness (all p ≤ 0.021). Survival time predicted missingness for 50% of RS variables and 60% of PROMMTT variables. Complete case proportions for the MT models ranged from 41% to 88%. Complete case analysis and multiple imputation gave similar correct classification results. Sensitivity analysis upper-lower bound ranges for the three MT models were 59-63%, 36-46%, and 46-58%. Prospective collection of ten-fold more variables with data quality assurance reduced overall missing data. Study site and patient survival were associated with missingness, suggesting that data were not missing completely at random and that complete case analysis may lead to biased results. Evaluating clinical prediction model accuracy may be misleading in the presence of missing data, especially with many predictor variables. The proposed sensitivity analysis, estimating correct classification under upper (best-case) and lower (worst-case) bounds, may be more informative than multiple imputation, which gave results similar to complete case analysis.
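
The sketch below captures the idea behind the upper and lower bounds used in the sensitivity analysis: combining complete-case accuracy with the extreme assumptions that all cases with missing predictors would be classified correctly (best case) or incorrectly (worst case). It illustrates the principle with made-up counts, not the study's exact procedure.

```python
# Illustration only; the patient counts below are invented, not from the study.
def classification_bounds(n_total: int, n_complete: int, n_correct_complete: int):
    """Return (worst_case, best_case) proportions correctly classified."""
    n_missing = n_total - n_complete
    worst = n_correct_complete / n_total               # all incomplete cases classified wrong
    best = (n_correct_complete + n_missing) / n_total  # all incomplete cases classified right
    return worst, best

# Example: 1,000 patients, 600 with complete predictors, 540 of those correctly classified.
print(classification_bounds(1000, 600, 540))  # (0.54, 0.94)
```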

Relevance: 100.00%

Abstract:

Philippe de Schoutheete takes as his point of departure in this Commentary the assumption that institutional treaty change cannot be a priority at present, although he does not exclude that it may become possible and desirable in a later period of economic growth and of greater self-confidence in public opinion. In a best-case scenario, he foresees that such a window of opportunity might open towards the end of the present legislature. In the meantime, he advises concentrating on adapting the institutions so that they work better and work more effectively together.

Relevance: 100.00%

Abstract:

The first step in conservation planning is to identify objectives. Most stated objectives for conservation, such as maximizing biodiversity outcomes, are too vague to be useful within a decision-making framework. One way to clarify the issue is to define objectives in terms of the risk of extinction for multiple species. Although the assessment of extinction risk for single species is common, few researchers have formulated an objective function that combines the extinction risks of multiple species. We sought to translate the broad goal of maximizing the viability of species into explicit objectives for use in a decision-theoretic approach to conservation planning. We formulated several objective functions based on extinction risk across many species and illustrated the differences between these objectives with simple examples. Each objective function was the mathematical representation of an approach to conservation and emphasized different levels of threat. Our objectives included minimizing the joint probability of one or more extinctions, minimizing the expected number of extinctions, and minimizing the increase in risk of extinction from the best-case scenario. With objective functions based on joint probabilities of extinction across species, any correlations in extinction probabilities had to be known, or the resulting decisions were potentially misleading. Additive objectives, such as the expected number of extinctions, did not produce the same anomalies. We demonstrated that the choice of objective function is central to the decision-making process because alternative objective functions can lead to different rankings of management options. Therefore, decision makers need to think carefully when selecting and defining their conservation goals.
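
The contrast between the joint-probability and additive objectives can be made concrete with a small worked example (assuming independent extinction probabilities, since, as noted above, any correlations would have to be known): two management options with identical expected numbers of extinctions can still differ under the joint-probability objective.

```python
from math import prod

def prob_at_least_one_extinction(risks):
    """Joint probability that one or more species goes extinct (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in risks)

def expected_extinctions(risks):
    """Expected number of extinctions: a simple additive objective."""
    return sum(risks)

option_a = [0.10, 0.10, 0.10]   # risk spread evenly across three species
option_b = [0.28, 0.01, 0.01]   # risk concentrated on one species
for name, risks in [("A", option_a), ("B", option_b)]:
    print(name,
          round(prob_at_least_one_extinction(risks), 3),
          round(expected_extinctions(risks), 3))
# Both options expect 0.30 extinctions, yet their joint probabilities differ
# (0.271 vs 0.294), so the two objectives can rank the options differently.
```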

Relevance: 100.00%

Abstract:

In this paper, we experimentally demonstrate the seamless integration of full-duplex frequency division duplex (FDD) long-term evolution (LTE) technology with radio over fiber (RoF) for eNodeB (eNB) coverage extension. The LTE signals use quadrature phase-shift keying (QPSK), 16-quadrature amplitude modulation (16-QAM), and 64-QAM, modulated onto orthogonal frequency division multiplexing (OFDM) for downlink (DL) transmissions and single-carrier frequency division multiplexing for uplink (UL) transmissions. The RoF system uses dedicated directly modulated lasers for DL and UL with dense wavelength division multiplexing (DWDM) for instantaneous connections and for mitigation of Rayleigh backscattering and nonlinear interference. DL and UL signals have varying carrier frequencies, categorized as broad frequency spacing (BFS), intermediate frequency spacing (IFS), and narrow frequency spacing (NFS). The adjacent channel leakage ratio (ACLR) for DL and UL with 64-QAM is similar for all frequency spacings, while crosstalk is observed for NFS. In the best-case scenario for DL and UL transmissions, we achieve error vector magnitude (EVM) values of ~2.30%, ~2.33%, and ~2.39% for QPSK, 16-QAM, and 64-QAM, respectively, while in the worst-case scenario, with NFS, EVM increases by 0.40% for all schemes.
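
For reference, the error vector magnitude values quoted above are RMS ratios of the error between received and ideal constellation points to the ideal constellation power; the hedged sketch below computes EVM for synthetic QPSK symbols (normalization conventions vary, and the noise level is chosen only to land near the reported ~2.3%, not to reproduce the experiment).

```python
import numpy as np

def evm_percent(received: np.ndarray, ideal: np.ndarray) -> float:
    """RMS EVM in percent for complex baseband symbols, normalised to the ideal power."""
    error_power = np.mean(np.abs(received - ideal) ** 2)
    reference_power = np.mean(np.abs(ideal) ** 2)
    return 100.0 * np.sqrt(error_power / reference_power)

# Synthetic QPSK symbols with additive Gaussian noise (illustrative noise level).
rng = np.random.default_rng(1)
qpsk = (rng.choice([-1, 1], 10_000) + 1j * rng.choice([-1, 1], 10_000)) / np.sqrt(2)
noisy = qpsk + (rng.normal(0, 0.016, 10_000) + 1j * rng.normal(0, 0.016, 10_000))
print(f"{evm_percent(noisy, qpsk):.2f}%")  # ~2.3%, comparable to the reported DL/UL values
```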