870 results for lock and key model


Relevance:

100.00%

Publisher:

Abstract:

Regional flood frequency techniques are commonly used to estimate flood quantiles when flood data are unavailable or the record length at an individual gauging station is insufficient for reliable analysis. These methods compensate for limited or unavailable data by pooling data from nearby gauged sites. This requires the delineation of hydrologically homogeneous regions in which the flood regime is sufficiently similar to allow the spatial transfer of information. It is generally accepted that hydrologic similarity results from similar physiographic characteristics, and thus these characteristics can be used to delineate regions and classify ungauged sites. However, as currently practiced, the delineation is highly subjective and dependent on the similarity measures and classification techniques employed. A standardized procedure for the delineation of hydrologically homogeneous regions is presented herein. Key aspects are a new statistical metric to identify physically discordant sites, and the identification of an appropriate set of physically based measures of extreme hydrological similarity. A combination of multivariate statistical techniques applied to multiple flood statistics and basin characteristics for gauging stations in the Southeastern U.S. revealed that basin slope, elevation, and soil drainage largely determine the extreme hydrological behavior of a watershed. Use of these characteristics as similarity measures in the standardized approach for region delineation yields regions that are more homogeneous and more efficient for quantile estimation at ungauged sites than those delineated using the alternative physically based procedures typically employed in practice. The proposed methods and key physical characteristics are also shown to be efficient for region delineation and quantile development in other areas composed of watersheds with statistically different physical composition. In addition, the use of aggregated values of key watershed characteristics was found to be sufficient for the regionalization of flood data; the added time and computational effort required to derive spatially distributed watershed variables does not increase the accuracy of quantile estimators for ungauged sites. This dissertation also presents a methodology by which flood quantile estimates in Haiti can be derived using relationships developed for data-rich regions of the U.S. As currently practiced, regional flood frequency techniques can only be applied within the predefined area used for model development. However, results presented herein demonstrate that the regional flood distribution can successfully be extrapolated to areas of similar physical composition located beyond the extent used for model development, provided that differences in precipitation are accounted for and the site in question can be appropriately classified within a delineated region.
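
The dissertation's discordancy metric is not reproduced here, but a minimal sketch of the general idea, assuming a Hosking–Wallis-style statistic computed over physiographic basin attributes (the attribute set, values, and the D > 3 rule of thumb are illustrative assumptions, not the dissertation's metric):

```python
# Discordancy screen over basin attributes, Hosking-Wallis style.
import numpy as np

def discordancy(attributes: np.ndarray) -> np.ndarray:
    """attributes: (n_sites, n_vars), e.g. slope, elevation, soil drainage.
    Returns D_i = (n/3) * (u_i - ubar)^T S^{-1} (u_i - ubar) per site."""
    n = attributes.shape[0]
    dev = attributes - attributes.mean(axis=0)
    S_inv = np.linalg.inv(dev.T @ dev)       # cross-products matrix, inverted
    return np.array([n / 3.0 * d @ S_inv @ d for d in dev])

rng = np.random.default_rng(0)
X = rng.normal([5.0, 300.0, 0.5], [1.0, 50.0, 0.1], size=(10, 3))
X[3] = [12.0, 900.0, 0.9]                    # one physically discordant basin
print(np.round(discordancy(X), 2))           # site 3 should stand out (D > ~3)
```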

Relevance:

100.00%

Publisher:

Abstract:

Correct estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constrain the firn depth evolution in Antarctica over the last deglaciation are presented: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in the ice, assuming that δ15N is affected only by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict the TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the glacial δ15N levels measured at JRI and EDML, a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial firn model–δ15N data mismatch at this site. While we could not conduct an in-depth study of the influence of snow impurities on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of accumulation rate may have been underestimated in the current formulation of firnification models.
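
As a worked illustration of the stated assumption (δ15N set by gravitational fractionation only), the standard barometric relation can be inverted for the diffusive column height; the constants are textbook values and the input δ15N is illustrative, not the paper's data:

```python
# Barometric (gravitational) fractionation of N2 isotopes in the firn column.
import math

DELTA_M = 1.0e-3    # kg/mol mass difference between 15N14N and 14N2
G, R = 9.81, 8.314  # m s^-2; J mol^-1 K^-1

def d15n_permil(z_m: float, T_K: float) -> float:
    """Gravitational enrichment (per mil) at diffusive column depth z."""
    return (math.exp(DELTA_M * G * z_m / (R * T_K)) - 1.0) * 1000.0

def depth_from_d15n(d15n: float, T_K: float) -> float:
    """Invert the barometric equation for the diffusive column height."""
    return R * T_K / (DELTA_M * G) * math.log(1.0 + d15n / 1000.0)

# 0.45 per mil at 230 K corresponds to a diffusive column of roughly 90 m
print(round(depth_from_d15n(0.45, 230.0), 1))
```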

Relevance:

100.00%

Publisher:

Abstract:

The development of northern high-latitude peatlands has played an important role in the carbon (C) balance of the land biosphere since the Last Glacial Maximum (LGM). At present, carbon storage in northern peatlands is substantial, estimated at 500 ± 100 Pg C (1 Pg C = 10^15 g C). Here, we develop and apply a peatland module embedded in a dynamic global vegetation and land surface process model (LPX-Bern 1.0). The peatland module features a dynamic nitrogen cycle, a dynamic C transfer between the peatland acrotelm (upper oxic layer) and catotelm (deep anoxic layer), hydrology- and temperature-dependent respiration rates, and peatland-specific plant functional types. Nitrogen limitation down-regulates average modern net primary productivity over peatlands by about half. Decadal acrotelm-to-catotelm C fluxes vary between −20 and +50 g C m^-2 yr^-1 over the Holocene. Key model parameters are calibrated with reconstructed peat accumulation rates from peat-core data. The model reproduces the major features of the peat-core data and of the observation-based modern circumpolar soil carbon distribution. Results from a set of simulations of possible evolutions of northern peat development and areal extent show that soil C stocks in modern peatlands increased by 365–550 Pg C since the LGM, of which 175–272 Pg C accumulated between 11 and 5 kyr BP. Furthermore, our simulations suggest a persistent C sequestration rate of 35–50 Pg C per 1000 yr in present-day peatlands under current climate conditions, and that this C sink could either be sustained or turn into a source by 2100 AD, depending on the climate trajectories projected for the different representative greenhouse gas concentration pathways.
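
A minimal two-box sketch of the acrotelm/catotelm carbon transfer described above; all parameter values are illustrative assumptions, not the calibrated LPX-Bern values:

```python
# Two-box peat carbon sketch: NPP enters the oxic acrotelm, a small flux
# crosses into the anoxic catotelm, and each box decomposes at its own rate.
def simulate_peat(years, npp=200.0,      # g C m^-2 yr^-1
                  k_acro=0.05,           # fast oxic decomposition, yr^-1
                  k_cato=1.0e-4,         # slow anoxic decomposition, yr^-1
                  transfer=0.01):        # acrotelm-to-catotelm flux, yr^-1
    acro = cato = 0.0
    for _ in range(years):               # annual explicit Euler steps
        acro += npp - (k_acro + transfer) * acro
        cato += transfer * acro - k_cato * cato
    return acro, cato

acro, cato = simulate_peat(10_000)       # a Holocene-scale run
print(f"acrotelm {acro:.0f} g C m^-2, catotelm {cato:.0f} g C m^-2")
```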

Relevance:

100.00%

Publisher:

Abstract:

Ecosystem management policies increasingly emphasize provision of multiple, as opposed to single, ecosystem services. Management for such "multifunctionality" has stimulated research into the role that biodiversity plays in providing desired rates of multiple ecosystem processes. Positive effects of biodiversity on indices of multifunctionality are consistently found, primarily because species that are redundant for one ecosystem process under a given set of environmental conditions play a distinct role under different conditions or in the provision of another ecosystem process. Here we show that the positive effects of diversity (specifically community composition) on multifunctionality indices can also arise from a statistical fallacy analogous to Simpson's paradox (where aggregating data obscures causal relationships). We manipulated soil faunal community composition in combination with nitrogen fertilization of model grassland ecosystems and repeatedly measured five ecosystem processes related to plant productivity, carbon storage, and nutrient turnover. We calculated three common multifunctionality indices based on these processes and found that the functional complexity of the soil communities had a consistent positive effect on the indices. However, only two of the five ecosystem processes also responded positively to increasing complexity, whereas the other three responded neutrally or negatively. Furthermore, none of the individual processes responded to both the complexity and the nitrogen manipulations in a manner consistent with the indices. Our data show that multifunctionality indices can obscure relationships that exist between communities and key ecosystem processes, leading us to question their use in advancing theoretical understanding of, and in informing management decisions about, how biodiversity is related to the provision of multiple ecosystem services.
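
The masking effect can be reproduced with toy data (not the study's): an averaging multifunctionality index rises with a treatment even when most individual processes respond neutrally or negatively, because weak responses are compressed by min-max standardization:

```python
# Toy demonstration of an averaging multifunctionality index masking
# divergent individual process responses.
import numpy as np

rng = np.random.default_rng(1)
complexity = np.repeat([1, 2, 3], 10)            # soil community complexity
slopes = np.array([2.0, 1.5, 0.0, -0.2, -0.3])   # 2 up, 1 flat, 2 weakly down
processes = complexity[:, None] * slopes + rng.normal(0, 0.5, (30, 5))

z = (processes - processes.min(0)) / (processes.max(0) - processes.min(0))
index = z.mean(axis=1)                           # common "averaging" index

for level in (1, 2, 3):
    print(level, round(index[complexity == level].mean(), 2))
# The index increases with complexity although 3 of 5 processes do not.
```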

Relevance:

100.00%

Publisher:

Abstract:

Tropical forests are carbon-dense and highly productive ecosystems. Consequently, they play an important role in the global carbon cycle. In the present study we used an individual-based forest model (FORMIND) to analyze the carbon balances of a tropical forest. The main processes of this model are tree growth, mortality, regeneration, and competition. Model parameters were calibrated using forest inventory data from a tropical forest at Mt. Kilimanjaro. The simulation results showed that the model successfully reproduces important characteristics of tropical forests (aboveground biomass, stem size distribution and leaf area index). The estimated aboveground biomass (385 t/ha) is comparable to biomass values in the Amazon and other tropical forests in Africa. The simulated forest reveals a gross primary production of 24 t C ha^-1 yr^-1. Modeling above- and belowground carbon stocks, we analyzed the carbon balance of the investigated tropical forest. The simulated carbon balance of this old-growth forest is zero on average. This study provides an example of how forest models can be used in combination with forest inventory data to investigate forest structure and local carbon balances.
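
A toy individual-based loop with the four FORMIND process classes named above (growth, mortality, regeneration, competition); the allometry and parameter values are illustrative assumptions, not the calibrated Kilimanjaro set:

```python
# Minimal gap-model-style loop over a cohort of individual stems.
import random

random.seed(42)
trees = [random.uniform(0.05, 0.5) for _ in range(200)]  # stem diameters (m), 1 ha

def agb(d):
    """Toy allometry: aboveground biomass (t) from stem diameter d (m)."""
    return 0.06 * (100 * d) ** 2.5 / 1000

for year in range(500):
    crowding = min(1.0, len(trees) / 300)                 # competition proxy
    trees = [d for d in trees
             if random.random() > 0.02 + 0.02 * crowding]  # density-dependent mortality
    trees = [d + 0.01 * max(0.0, 1 - d / 1.5) * (1 - 0.5 * crowding)
             for d in trees]                              # light- and size-limited growth
    trees += [0.01] * random.randint(0, 5)                # regeneration of saplings

print(f"{len(trees)} stems, {sum(map(agb, trees)):.0f} t/ha aboveground biomass")
```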

Relevance:

100.00%

Publisher:

Abstract:

Surgical robots have been proposed, and tested ex vivo, for drilling precise holes in the temporal bone for minimally invasive cochlear implantation. The main risk of the procedure is damage to the facial nerve, due either to mechanical interaction or to temperature elevation during the drilling process. To evaluate the thermal risk of the drilling process, a simplified model is proposed that aims to enable an assessment of the risk posed to the facial nerve for a given set of constant process parameters across different mastoid bone densities. The model uses the bone density distribution along the drilling trajectory in the mastoid bone to calculate a time-dependent heat production function at the tip of the drill bit. Using a time-dependent moving point source Green's function, the heat equation can be solved at a given point in space so that the resulting temperatures can be calculated over time. The model was calibrated and initially verified with in vivo temperature data collected during minimally invasive robotic drilling of 12 holes in four different sheep. The sheep were anesthetized, and the temperature elevations were measured with a thermocouple inserted in a previously drilled hole next to the planned drilling trajectory. Bone density distributions were extracted from pre-operative CT data by averaging Hounsfield values over the drill bit diameter. Post-operative μCT data were used to verify the drilling accuracy of the trajectories. The comparison of measured and calculated temperatures shows a very good match for both the heating and cooling phases. The average prediction error of the maximum temperature was less than 0.7 °C, and the average root mean square error was approximately 0.5 °C. To analyze potential thermal damage, the model was used to calculate temperature profiles and cumulative equivalent minutes at 43 °C at a minimal distance to the facial nerve. For the selected drilling parameters, the temperature elevation profiles and cumulative equivalent minutes suggest that the temperature rise during this minimally invasive cochlear implantation surgery may pose a risk to the facial nerve, especially in sclerotic or high-density mastoid bones. Optimized drilling parameters need to be evaluated, and the model could be used for future risk evaluation.
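
The cumulative-equivalent-minutes dose named above follows the standard Sapareto–Dewey formulation, sketched below on a synthetic temperature trace (not the sheep drilling data):

```python
# Cumulative equivalent minutes at 43 degC (CEM43), Sapareto-Dewey form.
def cem43(temps_degC, dt_min):
    """Sum of dt * R**(43 - T), with R = 0.5 at/above 43 degC, 0.25 below."""
    dose = 0.0
    for T in temps_degC:
        R = 0.5 if T >= 43.0 else 0.25
        dose += dt_min * R ** (43.0 - T)
    return dose

# Synthetic 60 s transient sampled at 1 s: heat 37 -> 46 degC, then cool.
trace = [37 + 9 * min(t / 20, 1) for t in range(30)] + \
        [46 - 9 * min(t / 30, 1) for t in range(30)]
print(f"CEM43 = {cem43(trace, dt_min=1/60):.2f} equivalent minutes")
```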

Relevance:

100.00%

Publisher:

Abstract:

Simulating surface wind over complex terrain is a challenge in regional climate modelling. This study therefore aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme, and the way WRF is nested within the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, where the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser. The results demonstrate the important role of horizontal resolution: the step from a 6 km to a 2 km grid size significantly improves model performance. In summary, the combination of a 2 km grid size, the non-local PBL scheme modified to explicitly account for non-resolved orography, and analysis or spectral nudging is a superior set-up when dynamical downscaling aims at reproducing real wind fields.
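
A generic sketch of the kind of station-wise verification used to rank such set-ups, with speed bias and RMSE plus a circular error for direction; the arrays below are synthetic stand-ins for matched model–observation pairs:

```python
# Station-wise surface wind verification metrics.
import numpy as np

def verify(sim_speed, obs_speed, sim_dir, obs_dir):
    bias = np.mean(sim_speed - obs_speed)                # > 0: overestimation
    rmse = np.sqrt(np.mean((sim_speed - obs_speed) ** 2))
    ddir = (sim_dir - obs_dir + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    return bias, rmse, np.mean(np.abs(ddir))

rng = np.random.default_rng(7)
obs = rng.gamma(2.0, 4.0, 100)                 # storm-case 10 m wind speeds (m/s)
sim = 1.3 * obs + rng.normal(0, 1, 100)        # a set-up that overestimates speed
b, r, d = verify(sim, obs, rng.uniform(0, 360, 100), rng.uniform(0, 360, 100))
print(f"bias {b:+.1f} m/s, RMSE {r:.1f} m/s, mean |dir error| {d:.0f} deg")
```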

Relevance:

100.00%

Publisher:

Abstract:

CONTEXT Enhanced Recovery after Surgery (ERAS) programs are multimodal care pathways that aim to decrease intra-operative blood loss, decrease postoperative complications, and reduce recovery times. OBJECTIVE To provide an overview of the use and key elements of ERAS pathways, and to define needs for future clinical trials. EVIDENCE ACQUISITION A comprehensive systematic MEDLINE search was performed for English-language reports published before May 2015 using the terms "postoperative period," "postoperative care," "enhanced recovery after surgery," "enhanced recovery," "accelerated recovery," "fast track recovery," "recovery program," "recovery pathway," "ERAS," and "urology" or "cystectomy" or "urologic surgery." EVIDENCE SYNTHESIS We identified 18 eligible articles. Patient counseling, physical conditioning, avoiding excessive alcohol and smoking, and good nutrition appeared to protect against postoperative complications. Fasting from solid food for only 6 h and perioperative liquid-carbohydrate loading up to 2 h prior to surgery appeared to be safe and reduced recovery times. Restricted, balanced, and goal-directed fluid replacement is effective when individualized according to patient morbidity and surgical procedure. Decreased intraoperative blood loss may be achieved by several measures. Deep vein thrombosis prophylaxis, antibiotic prophylaxis, and thermoregulation were found to help reduce postsurgical complications, as was a multimodal approach to postoperative nausea, vomiting, and analgesia. Chewing gum, prokinetic agents, oral laxatives, and an early resumption of a normal diet appear to aid a faster return to normal bowel function. Further studies should compare anesthetic protocols, refine analgesia, and evaluate the importance of robot-assisted surgery and the need for, and timing of, drains and catheters. CONCLUSIONS ERAS regimens are multidisciplinary, multimodal pathways that optimize postoperative recovery. PATIENT SUMMARY This review provides an overview of the use and key elements of Enhanced Recovery after Surgery programs, which are multimodal, multidisciplinary care pathways that aim to optimize postoperative recovery. Additional conclusions include identifying effective procedures within Enhanced Recovery after Surgery programs and defining needs for future clinical trials.

Relevance:

100.00%

Publisher:

Abstract:

Seven coral reef communities were defined on Shiraho fringing reef, Ishigaki Island, Japan. Net photosynthesis and calcification rates were measured by in situ incubations at 10 sites that included six of the defined communities and that covered most of the area on the reef flat and slope. Net photosynthesis on the reef flat was positive overall, but the reef flat acts as a source of atmospheric CO2 because the measured calcification/photosynthesis ratio of 2.5 is greater than the critical ratio of 1.67. Net photosynthesis on the reef slope was negative. Almost all excess organic production from the reef flat is expected to be exported to the outer reef and consumed by the communities there. Therefore, the total net organic production of the whole reef system is probably almost zero, and the whole reef system also acts as a source of atmospheric CO2. Net calcification rates of the reef slope corals were much lower than those of the branching corals: the accumulation rate of the former was approximately 0.5 m kyr^-1 and that of the latter ~0.7-5 m kyr^-1. Consequently, reef slope corals could not grow fast enough to keep up with or catch up to rising sea levels during the Holocene, whereas the branching corals grew fast enough to keep up with the rising sea level. Therefore, a transition between early Holocene and present-day reef communities is expected: branching coral communities would have dominated while reef growth kept pace with sea level rise, and the reef was constructed with a branching coral framework. The outside of this framework was then covered and built up by reef slope corals, and present-day reefs were constructed.
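
The critical ratio of 1.67 follows from the fact that precipitating 1 mol of CaCO3 releases roughly 0.6 mol of CO2 to seawater (Ψ ≈ 0.6 under typical surface conditions), so a reef is a CO2 source when G/P exceeds 1/Ψ. A minimal worked check with illustrative flux values:

```python
# Worked check of the critical calcification/photosynthesis ratio.
PSI = 0.6   # mol CO2 released to seawater per mol CaCO3 precipitated

def net_co2_release(G, P_net):
    """Positive means a source of atmospheric CO2 (units follow the inputs)."""
    return PSI * G - P_net

print(round(1 / PSI, 2))                    # critical G/P ratio ~ 1.67
print(net_co2_release(G=0.25, P_net=0.10))  # G/P = 2.5, as on the reef flat: source
print(net_co2_release(G=0.10, P_net=0.10))  # G/P = 1.0: sink
```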

Relevance:

100.00%

Publisher:

Abstract:

The relative contribution of soft bottoms to the community metabolism (primary production, respiration and net calcification) of a barrier reef flat has been investigated at Moorea (French Polynesia). Community metabolism of the sedimentary area was estimated using in situ incubations in perspex chambers, and compared with estimates of community metabolism of the whole reef flat obtained using a Lagrangian technique (Gattuso et al., 1996. Carbon flux in coral reefs. 1. Lagrangian measurement of community metabolism and resulting air-sea CO2 disequilibrium. Mar. Ecol. Prog. Ser. 145, 109-121). Net organic carbon production (E), respiration (R) and net calcification (G) of sediments were measured in seven incubations performed in triplicate at different irradiances. Respiration and environmental parameters were also measured at four randomly selected additional stations. A photosynthesis–irradiance model was used to calculate oxygen (O2), organic carbon (CO2) and calcium carbonate (CaCO3) evolution from surface irradiance during a diel cycle. As the chlorophyll a content of the sediment did not differ significantly between stations, primary production of the sediment was considered homogeneous over the whole lagoon; carbon production at the test station could thus be modelled from surface irradiance. The modelled respiration was two times higher at the test station than the mean respiration of the barrier reef, and thus underestimates the sediment contribution to excess production. Sediments cover 40-60% of the surface and accounted for 2.8-4.1% of organic carbon excess production when estimated with the modelled R, and 21-32% when the mean R value was used. The sedimentary CaCO3 budget was a very minor component of the whole reef budget.
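
A minimal sketch of the scaling step, assuming a hyperbolic-tangent photosynthesis–irradiance law and a sinusoidal surface irradiance; the functional form and parameter values are illustrative, not the paper's fit:

```python
# P-I curve driven by a sinusoidal irradiance, summed over a diel cycle.
import math

PG_MAX, I_K, RESP = 15.0, 300.0, 5.0   # mmol O2 m^-2 h^-1; I_K in umol m^-2 s^-1

def irradiance(hour, noon_peak=2000.0, daylength=12.0):
    x = (hour - 12.0) / daylength * math.pi
    return max(0.0, noon_peak * math.cos(x))       # zero at night

def net_production(hour):                          # P(I) = Pg_max*tanh(I/I_K) - R
    return PG_MAX * math.tanh(irradiance(hour) / I_K) - RESP

daily = sum(net_production(h) for h in range(24))  # hourly steps over a diel cycle
print(f"net community production: {daily:+.1f} mmol O2 m^-2 d^-1")
```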

Relevance:

100.00%

Publisher:

Abstract:

Coral reefs are characterized by the enormous carbonate production of their organisms. It is known that rapid calcification is linked to photosynthesis under control of the carbonate equilibrium in seawater. We have established a model simulating the coexisting states of photosynthesis and calcification in order to examine their effects on the carbonate system in seawater. Supposing that the rates of photosynthesis and calcification are proportional to the concentrations of their inorganic carbon sources, the model calculations indicate that three kinds of unique interactions between organic and inorganic carbon production are expected: photosynthetic enhancement of calcification, calcification that benefits photosynthesis, and carbonate dissolution induced by respiration. The first effect appears when the photosynthetic rate is more than approximately 1.2 times that of calcification; it is caused by the increase of the CO3^2- concentration and the carbonate saturation degree in seawater. If photosynthesis uses molecular carbon dioxide, the second effect occurs when the calcification rate is more than approximately 1.6 times that of photosynthesis. Time-series model experiments indicate that photosynthesis and calcification potentially enhance each other and that organic and inorganic carbon are produced more efficiently in the coexisting system than in the isolated reactions. These coexisting effects on production enhancement are expected to appear not only in the internal pool of organisms but also in a reef environment that is isolated from the outer ocean during low tide. According to measurements on the fringing-type Shiraho Reef in the Ryukyu Islands, the diurnal changes of water properties (pH, total alkalinity, total carbon dioxide and carbonate saturation degree) were conspicuous. This environment offers appropriate conditions for the appearance of these coexisting effects. The photosynthetic enhancement of calcification and the respiratory inducement of carbonate dissolution were observed during day-time and night-time slack-water periods, respectively. These coexisting effects, especially the photosynthetic enhancement of calcification, appear to play important roles in the flourishing of coral reef communities.
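
A minimal bookkeeping sketch of the coexisting-state idea: each rate is proportional to its inorganic carbon source, with the standard DIC/alkalinity stoichiometry (1 DIC per mole fixed by photosynthesis; 1 DIC and 2 TA per mole of CaCO3). The fixed speciation fractions are a crude stand-in for a full carbonate-chemistry solver and are assumptions, not the paper's model:

```python
# Coupled photosynthesis-calcification bookkeeping on DIC and alkalinity.
F_CO2, F_CO3 = 0.01, 0.10          # rough fractions of DIC as CO2 and CO3(2-)

def step(dic, ta, kp, kc, dt=1.0):
    photo = kp * (F_CO2 * dic)     # photosynthesis feeds on molecular CO2
    calc = kc * (F_CO3 * dic)      # calcification feeds on carbonate ion
    return dic - (photo + calc) * dt, ta - 2.0 * calc * dt

dic, ta = 2000.0, 2300.0           # umol/kg, typical surface seawater
for _ in range(10):                # e.g. ten hours on a tide-isolated reef flat
    dic, ta = step(dic, ta, kp=0.5, kc=0.05)
print(f"DIC {dic:.0f}, TA {ta:.0f} umol/kg")
```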

Relevance:

100.00%

Publisher:

Abstract:

The authors, all from Universidad Politécnica de Madrid (UPM), have worked on academic and practical cases on this subject at different times. Building on the precedent of the probabilistic safety models for concrete developed by E. Torroja and A. Páez in Madrid, Spain, around 1957 (a line of work now represented in the ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for the superposition reductions of independent loads. Using it, he derived a load coefficient pattern for codes, presented in Rome in February 1969 and practically adopted for European construction, and at JCSS Lisbon in February 1974 he suggested its unification across concrete, steel and aluminium. That model represents loads by Gumbel Type I distributions: the 50-year distribution of one load type is reduced to one year so that it can be added to the other independent loads, and the sum is set, within Gumbel theory, back to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current Construction Eurocodes derived from the Model Codes. The system was considered by the author in CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage norm for MOPU Spain, the authors developed an optimization model that provides a way to determine the return period, from 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in southeastern Spain modelled with a Gumbel Type I law, and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II law; the model can be modernized with a wider variety of extreme-value laws. In the MOPU drainage norm, the drafting commission also acted as an expert panel to set a table of return periods for the elements of road drainage, in effect a complex multi-criteria decision system. These precedent ideas were used, for example, in widely applied codes and presented at symposia and meetings, but not published in English-language journals; a condensate of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and give modest hints of intended applications to agricultural and environmental planning, namely the selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to climate change, to production and commercial systems, and to other aspects such as social and financial ones.
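
A minimal sketch of the Gumbel Type I machinery described above: the T-year design value from an annual-maximum fit, and the location shift that converts an annual-maximum law into an n-year-maximum law (the reduction/aggregation step). All numbers are illustrative:

```python
# Gumbel Type I return-period arithmetic.
import math

def gumbel_quantile(mu, beta, T):
    """Design value with return period T years: F(x) = 1 - 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def n_year_params(mu, beta, n):
    """Max of n i.i.d. Gumbel(mu, beta) years is Gumbel(mu + beta*ln n, beta)."""
    return mu + beta * math.log(n), beta

mu1, beta1 = 100.0, 20.0                          # annual-maximum flow (m^3/s)
print(round(gumbel_quantile(mu1, beta1, 50), 1))  # 50-year design flow: 178.0
mu50, beta50 = n_year_params(mu1, beta1, 50)      # 50-year-window maximum law
print(round(mu50, 1), beta50)                     # shifted location, same scale
```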

Relevance:

100.00%

Publisher:

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing a program as a function of its input data sizes, without actually executing the program. While powerful resource analysis frameworks for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, accuracy and reliability of the results still needed further investigation. This thesis tackles that need from four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses that keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. This thesis presents two extensions to this approach: the first considers array accesses in addition to object fields; the second handles cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without re-analyzing fragments of code that are not affected by the changes. During software development programs are permanently modified, yet most analyzers still read and analyze the entire program at once, non-incrementally. This thesis presents an incremental resource usage analysis which, after a change to the program is made, is able to reconstruct the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm that can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of the cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. This thesis focuses instead on developing a formal framework for the verification of the resource guarantees obtained by the analyzers. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code: COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of this work is to show that the proposed cooperation of tools can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems: objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. This thesis proposes a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
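
A toy sketch of the incremental fixed-point idea from point (2): per-method cost summaries plus a call graph, where after an edit only the changed methods and their transitive callers are revisited. Cost here is a scalar, whereas the thesis works with symbolic cost functions over several abstract domains, so this only illustrates the propagate-only-on-change mechanics:

```python
# Incremental worklist fixed point over per-method cost summaries.
from collections import defaultdict

call_graph = {"main": ["parse", "solve"], "solve": ["parse"], "parse": []}
local_cost = {"main": 2, "solve": 5, "parse": 3}   # cost excluding callees

callers = defaultdict(set)
for caller, callees in call_graph.items():
    for callee in callees:
        callers[callee].add(caller)

summary = {}

def recompute(changed):
    worklist = set(changed)
    while worklist:
        m = worklist.pop()
        new = local_cost[m] + sum(summary.get(c, 0) for c in call_graph[m])
        if summary.get(m) != new:      # propagate to callers only on change
            summary[m] = new
            worklist |= callers[m]

recompute(call_graph)                  # initial whole-program analysis
print(summary)                         # parse=3, solve=8, main=13
local_cost["parse"] = 4                # an edit that touches only parse
recompute({"parse"})                   # incremental re-analysis of the affected methods
print(summary)                         # parse=4, solve=9, main=15
```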

Relevance:

100.00%

Publisher:

Abstract:

This paper shows the results of research aimed at formulating a general model to support the implementation and management of an urban road pricing scheme. After preliminary work to define the state of the art in the field of sustainable urban mobility strategies, the problem is set up theoretically in terms of transport economics, introducing the concept of external costs, duly translated into the principle of pricing the use of public infrastructure. The research is based on the definition of a set of direct and indirect indicators that characterize urban areas by land use, mobility, environmental and economic conditions. These indicators were calculated for a selected set of typical urban areas in Europe on the basis of the results of a survey carried out by means of a specific questionnaire. Once the most typical and interesting applications of the road pricing concept had been identified, in cities such as London (Congestion Charging), Milan (Ecopass), Stockholm (Congestion Tax) and Rome (ZTL), a large benchmarking exercise and the cross-analysis of direct and indirect indicators allowed a simple general model, guidelines and key requirements to be defined for the implementation of a pricing-based traffic restriction scheme in a generic urban area. The model was finally applied to the design of a road pricing scheme for a particular area of Madrid, and to the quantification of the expected results of its implementation from a land use, mobility, environmental and economic perspective.

Relevance:

100.00%

Publisher:

Abstract:

This work presents a behavioral-analytical hybrid loss model for a buck converter. The model has been designed for a wide operating frequency range up to 4 MHz and a low power range (below 20 W). It focuses on the switching losses of the power MOSFETs. The main advantages of the model are its fast calculation time and good accuracy. It has been validated by simulation and experimentally with one GaN power transistor and two Si MOSFETs. Results show good agreement between measurements and the model.
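
A minimal analytical sketch of the MOSFET loss terms such a hybrid model typically combines (V-I overlap switching, output-capacitance and conduction losses); the fitted behavioral part of the paper's model is not reproduced, and all device values are illustrative:

```python
# First-order analytical MOSFET losses in a hard-switched buck converter.
def mosfet_losses(v_in, i_out, f_sw, t_r, t_f, c_oss, r_dson, duty):
    p_overlap = 0.5 * v_in * i_out * (t_r + t_f) * f_sw  # V-I overlap at each edge
    p_coss = 0.5 * c_oss * v_in ** 2 * f_sw              # output-capacitance energy
    p_cond = i_out ** 2 * r_dson * duty                  # on-state conduction
    return p_overlap + p_coss + p_cond

# e.g. a 12 V to 3.3 V, 3 A buck at 4 MHz with a fast, low-charge FET
p = mosfet_losses(v_in=12, i_out=3, f_sw=4e6, t_r=2e-9, t_f=3e-9,
                  c_oss=150e-12, r_dson=0.03, duty=3.3 / 12)
print(f"high-side MOSFET loss ~ {p:.2f} W")
```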