980 results for Conjectural Variations Model


Relevance: 30.00%

Publisher:

Abstract:

In this paper, the dynamic response of a hydro power plant providing secondary regulation reserve is studied in detail. Special emphasis is given to elastic water column effects in both the penstock and the tailrace tunnel. For this purpose, a nonlinear model is used, based on the analogy between the mass and momentum conservation equations of a water conduit and those of wave propagation in transmission lines. The influence of the plant configuration and design parameters on the fulfilment of the Spanish Electrical System Operator's requirements is analysed.
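The analogy invoked in this abstract can be made explicit. A minimal sketch using the standard water-hammer equations (not necessarily the exact formulation of the paper): for a conduit of cross-section A, diameter D, wave speed a and friction factor f, the piezometric head H and flow Q satisfy

```latex
\frac{\partial H}{\partial x} + \frac{1}{gA}\frac{\partial Q}{\partial t}
  + \frac{f\,Q\lvert Q\rvert}{2\,g\,D\,A^{2}} = 0,
\qquad
\frac{\partial Q}{\partial x} + \frac{gA}{a^{2}}\frac{\partial H}{\partial t} = 0,
```

which map term by term onto the telegrapher's equations of a transmission line, with H playing the role of voltage, Q of current, a hydraulic inductance per unit length L' = 1/(gA), a capacitance per unit length C' = gA/a², and the friction term acting as a (nonlinear) series resistance.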

Abstract:

During the last few years, cities around the world have invested substantial amounts of money in measures to reduce congestion and car trips. These investments are potential responses to the well-known urban sprawl phenomenon, also called the "development trap", which leads to further congestion and a greater share of our time spent in slow-moving cars. Along this search for solutions, the complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The main question under discussion is how to encourage multi-stop tours. The objective of this paper is thus to verify whether unobserved factors influence tour complexity. For this purpose, we use a database from a survey conducted in 2006-2007 in Madrid, a suitable case study for analyzing urban sprawl given its new urban developments and the substantial changes in mobility patterns in recent years. A total of 943 individuals were interviewed in 3 selected neighbourhoods (CBD, urban and suburban). We study the effect of unobserved factors on trip frequency. The paper presents the estimation of a hybrid model in which the latent variable is the propensity to travel and the discrete choice model comprises 5 tour-type alternatives. The results show that the characteristics of the Madrid neighbourhoods are important in explaining trip frequency. The influence of land-use variables on trip generation is clear, in particular the presence of commercial retail. Through the estimation of elasticities and forecasting, we determine to what extent land-use policy measures modify travel demand. Comparing aggregate elasticities with percentage variations shows that percentage variations can lead to inconsistent results. The results also show that hybrid models explain travel behaviour better than traditional discrete choice models.

Abstract:

Core competencies form the basis of an organization's skills and are the basic element of successful strategy execution. Identifying and strengthening core competencies enhances flexibility, strategically positioning a firm to respond to competition in a dynamic marketplace, and can make the difference in quality among firms that follow the same business model. A correct understanding of the business model concept, employing the right core competencies, organizing them effectively, and building the business model around competencies that are constantly gained and assimilated can result in enhanced business performance, with implications for firms that want to innovate their business models. Flexibility can be understood as a firm's agility in shifting focus in response to external factors such as changing markets, new technologies or competition, and a firm's success can be gauged by the ability it displays in this transition. Although industry transformations generally emanate from technological changes, recent examples suggest they may also be driven by the introduction of new business models, and nowhere is this more relevant than in the airline industry. An analysis of the business model flexibility of 17 airlines from Asia, Europe and Oceania, using core competence as the indicator, reveals inconsistencies in the core competence strategy of certain airlines and a corresponding reduction in business performance. The performance variations are explained by a service-oriented core competence strategy that ultimately gives airlines a flexible business model, one that not only increases business performance but also helps reduce uncertainties in the internal and external operating environments. This is all the more relevant in the airline industry, where the product (the air transportation of passengers) minus the service competence is essentially the same across firms.

Abstract:

Presentation given at the PhD Seminar of ITS 2011 in Budapest. ICTs (Information and Communication Technologies) currently account for 2% of total carbon emissions. However, although modern standards require strict measures to reduce energy consumption across all industrial and service sectors, the ICT sector also faces growing demand for services and bandwidth. The deployment of Next Generation Networks (NGN) will be the answer to this new demand; more specifically, Next Generation Access Networks (NGANs) will provide higher-bandwidth access to users. Several policy and cost analyses are being carried out to understand the risks and opportunities of new deployments, but the question of what role energy consumption plays in NGANs seems to be off the table. This paper therefore proposes a model to analyse the energy consumption of the main fibre-based NGAN architectures: Fibre To The Home (FTTH), in both its Passive Optical Network (PON) and Point-to-Point (PtP) variations, and FTTx/VDSL. The aim of the analysis is to provide deeper insight into the impact of new deployments on the energy consumption of the ICT sector and into the effect of energy consumption on the life-cycle cost of NGANs. The paper also presents an energy consumption comparison of these architectures, particularised to the specific geographic and demographic distribution of users in Spain but easily extendable to other countries.
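A minimal sketch of how such per-architecture comparisons are typically set up (all wattages and split ratios below are illustrative assumptions, not figures from the paper): each subscriber's consumption is the home device plus a share of the central-office port, which PON architectures split among many users.

```python
# Hypothetical per-user power model for fibre access architectures.
# Every numeric value here is an assumption for illustration only.

def power_per_user(p_cpe, p_office_port, users_per_port):
    """Power drawn per subscriber: the customer-premises device plus
    the subscriber's share of the central-office port."""
    return p_cpe + p_office_port / users_per_port

# FTTH-PON: one OLT port is shared through a passive splitter (e.g. 1:64).
pon = power_per_user(p_cpe=5.0, p_office_port=10.0, users_per_port=64)

# FTTH-PtP: each user has a dedicated point-to-point Ethernet port.
ptp = power_per_user(p_cpe=5.0, p_office_port=2.0, users_per_port=1)

print(f"PON = {pon:.2f} W/user, PtP = {ptp:.2f} W/user")
```

Under these toy numbers the shared OLT port is what gives PON its per-user advantage; a real analysis would of course also weigh bandwidth per user and deployment cost.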

Abstract:

We present an undergraduate course on concurrent programming where formal models are used at different stages of the learning process. The main practical difference from other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so-called shared resources), rather than on the specific constructs of some programming language. Using a resource-centric rather than a language-centric approach has benefits for both teachers and students. Besides the obvious advantage of being independent of the programming language, the models help in the early validation of concurrent software designs, provide students and teachers with a lingua franca that greatly simplifies communication in the classroom and during supervision, and help in the automatic generation of tests for the practical assignments. This method has been in use, with slight variations, for some 15 years, surviving changes in programming language and course length. In this article, we describe the components and structure of the current incarnation of the course, which uses Java as the target language, and some tools used to support our method. We provide a detailed description, from a teaching perspective, of the different outcomes that the model-driven approach delivers: validation of the initial design, automatic generation of tests, and mechanical generation of code. A critical discussion of the perceived advantages and risks of our approach follows, including proposals on how these risks can be minimized. We include a statistical analysis showing that our method has a positive impact on students' ability to understand concurrency and to generate correct code.
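As an illustration of the resource-centric idea (a generic sketch in Python rather than the course's Java and its own shared-resource notation): a resource specified by an invariant and two operations with conditional preconditions can be transformed mechanically into a monitor.

```python
import threading

class BoundedBuffer:
    """Shared resource: a FIFO buffer of capacity N.
    Invariant: 0 <= len(items) <= capacity.
    put blocks while the buffer is full; get blocks while it is empty."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        lock = threading.Lock()
        self._not_full = threading.Condition(lock)
        self._not_empty = threading.Condition(lock)

    def put(self, x):
        with self._not_full:
            while len(self.items) >= self.capacity:  # CPRE: buffer not full
                self._not_full.wait()
            self.items.append(x)
            self._not_empty.notify()

    def get(self):
        with self._not_empty:
            while not self.items:                    # CPRE: buffer not empty
                self._not_empty.wait()
            x = self.items.pop(0)
            self._not_full.notify()
            return x
```

The conditional preconditions of the formal model become `while`-guarded waits, and every state change re-signals the conditions it may have enabled; this mechanical mapping is what makes model-to-code transformation (and test generation from the model) systematic.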

Abstract:

Hansbreen is a tidewater glacier in Svalbard with a grounded tongue, about 16 km long and ca. 2.5 km wide at its tongue. Over recent decades, the calving front position has shown a general retreating trend, mostly smooth but with occasional abrupt changes. We apply a full-Stokes model of glacier dynamics, incorporating a crevasse-depth calving model, with the aim of reproducing the glacier front positions observed since 1936 and analyzing the sensitivity of the model to environmental parameters.

Abstract:

This thesis discusses correction methods that compensate for variations in lighting conditions in colour image and video applications. Such variations often cause Computer Vision algorithms that use colour features to describe objects to fail. Three research questions define the framework of the thesis. The first addresses the similarities in photometric behaviour between images of adjacent surfaces. Based on an analysis of the image formation model in dynamic situations, the thesis proposes a model that predicts the colour variations of one image region from the variations of the surrounding regions, called the Quotient Relational Model of Regions. The model is valid when the light sources illuminate all of the surfaces included in it; when these surfaces are close to each other and have similar orientations; and when they are primarily Lambertian. Under these circumstances, the photometric response of one region can be related to the others by a linear combination. No previous work proposing such a relational model was found in the scientific literature. The second question goes a step further and asks whether these similarities can be used to correct unknown photometric variations in an unknown region from known adjacent regions. A method called Linear Correction Mapping is proposed that provides an affirmative answer under the circumstances characterised above. A training stage is required to determine the parameters of the model. The method, initially formulated for a single camera, is extended to multi-camera architectures with non-overlapping fields of view; to this end, only a few image samples of the same object captured by all of the cameras are required. The method accounts for both illumination variations and changes in the cameras' exposure settings. Every image correction method fails when the image of the object to be corrected is overexposed or its signal-to-noise ratio is very low. The third question therefore asks whether the acquisition process can be controlled to obtain an optimal exposure under uncontrolled lighting conditions. A method called Camera Exposure Control is proposed that maintains a suitable exposure provided that the light variations fall within the dynamic range of the camera. Each proposed method was evaluated individually. The experimental methodology consisted of first selecting scenarios covering representative situations in which the methods are theoretically valid. Linear Correction Mapping was validated in three object re-identification applications (vehicles, faces and persons) that use colour distributions as features, while Camera Exposure Control was tested in an outdoor car park. In addition, several performance indicators were defined to objectively compare the results of the proposed methods against relevant state-of-the-art correction and auto-exposure methods. The evaluation showed that the proposed methods outperform the compared ones in most situations. Based on these results, the answers to the research questions are affirmative, although in limited circumstances: the hypotheses concerning prediction, the correction based on it, and auto-exposure are feasible in the situations identified throughout the thesis, but they cannot be guaranteed to hold in general. Finally, new questions and scientific challenges arising from this work are highlighted as future research.
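The core of the second question can be illustrated with a toy version of the idea (hypothetical data; the actual Linear Correction Mapping is defined in the thesis): learn a linear map between the colour response of a known region before and after an illumination change, then apply it to predict how another region looks under the new illumination.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RGB samples of a known (reference) region before the
# illumination change ...
ref_before = rng.uniform(0.2, 0.8, size=(100, 3))
# ... and after: a per-channel gain/offset change plus sensor noise.
gain = np.array([1.3, 0.9, 1.1])
offset = np.array([0.05, -0.02, 0.01])
ref_after = ref_before * gain + offset + rng.normal(0, 0.005, (100, 3))

# Fit the correction per channel by least squares: after = a*before + b.
params = []
for c in range(3):
    A = np.column_stack([ref_before[:, c], np.ones(100)])
    a, b = np.linalg.lstsq(A, ref_after[:, c], rcond=None)[0]
    params.append((a, b))

# Apply the learned map to an adjacent, unknown region.
unknown_before = rng.uniform(0.2, 0.8, size=(10, 3))
predicted = np.stack(
    [a * unknown_before[:, c] + b for c, (a, b) in enumerate(params)], axis=1
)
```

With enough samples, the fitted gains and offsets recover the true illumination change, which is the basis for correcting regions where that change was never observed directly.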

Abstract:

This paper describes the design and application of the Atmospheric Evaluation and Research Integrated model for Spain (AERIS). Currently, AERIS can provide concentration profiles of NO2, O3, SO2, NH3 and PM in response to emission variations of relevant sectors in Spain. Results are calculated using transfer matrices based on an air quality modelling system (AQMS) composed of the WRF (meteorology), SMOKE (emissions) and CMAQ (atmospheric-chemical processes) models. The AERIS outputs were statistically tested against the conventional AQMS and against observations, revealing good agreement in both cases. At the moment, integrated assessment in AERIS focuses only on the link between emissions and concentrations; the quantification of deposition, impacts (health, ecosystems) and costs will be introduced in the future. In conclusion, the main asset of AERIS is its accuracy in predicting air quality outcomes for different scenarios through a simple yet robust modelling framework, avoiding complex programming and long computing times.
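The transfer-matrix idea can be sketched as follows (toy numbers, not AERIS coefficients): sensitivities of receptor concentrations to sectoral emission changes are pre-computed offline with the full AQMS, after which any scenario reduces to a matrix-vector product.

```python
import numpy as np

# Rows: receptor pollutants (NO2, O3); columns: emitting sectors
# (traffic, industry). Entries are hypothetical sensitivities
# (ug/m3 per 10% emission change), derived offline from the full
# WRF/SMOKE/CMAQ chain.
T = np.array([[ 1.8, 0.6],
              [-0.9, 0.2]])  # O3 can rise when traffic NOx emissions fall

baseline = np.array([38.0, 55.0])    # ug/m3 at the receptor
delta_e = np.array([-2.0, 0.5])      # scenario: -20% traffic, +5% industry

scenario = baseline + T @ delta_e    # evaluated in microseconds, no re-run
print(scenario)
```

This is what buys the "simple yet robust" framework its speed: the expensive chemistry-transport runs happen once, and scenario screening afterwards is pure linear algebra.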

Abstract:

Since the memristor was first built in 2008 at HP Labs, numerous devices and models have been presented, and new applications appear frequently. However, the integration of the device at the circuit level is not straightforward, because the available models are still immature and/or impose high computational loads, making their simulation long and cumbersome. This study assists circuit and systems designers in integrating memristors into their applications, while aiding model developers in validating their proposals. We introduce a memristor application framework to support the work of both the model developer and the circuit designer. First, the framework includes a library with the best-known memristor models, easily extensible with upcoming models. Systematic modifications have been applied to these models to provide better convergence and significant simulation speedups. Second, a quick device simulator allows the study of the models' response under different scenarios, helping the designer with stimuli and operation-time selection. Third, fine-tuning of the device, including parameter variations and threshold determination, is also supported. Finally, SPICE/Spectre subcircuit generation is provided to ease the integration of the devices into application circuits. The framework gives the designer total control over convergence, computational load, and the evolution of system variables, overcoming the usual problems in the integration of memristive devices.
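As an example of the kind of model such a library would contain, here is a minimal simulation of the well-known HP linear ion-drift memristor (the parameters are typical textbook values, not those of any specific model in the framework):

```python
import math

# HP linear ion-drift model: the state is the doped-region width w in [0, D].
R_ON, R_OFF = 100.0, 16e3   # resistances of the fully doped/undoped device, ohms
D = 10e-9                   # device thickness, m
MU_V = 1e-14                # dopant mobility, m^2/(V*s)

def simulate(v_of_t, t_end, dt=1e-6, x0=0.5):
    """Forward-Euler integration of the state equation.
    x = w/D is the normalized state; returns a list of (t, i, x)."""
    x = x0
    out = []
    t = 0.0
    while t < t_end:
        m = R_ON * x + R_OFF * (1.0 - x)   # memristance M(x)
        i = v_of_t(t) / m
        x += MU_V * R_ON / D**2 * i * dt   # dx/dt = mu_v * R_on / D^2 * i
        x = min(max(x, 0.0), 1.0)          # hard bounds on the state
        out.append((t, i, x))
        t += dt
    return out

# A sinusoidal drive produces the characteristic pinched hysteresis loop
# in the i-v plane.
trace = simulate(lambda t: math.sin(2 * math.pi * 50 * t), t_end=0.02)
```

Even this simplest model shows why convergence aids matter: the hard state bounds introduce discontinuities in dx/dt that unmodified SPICE solvers handle poorly, which is exactly the kind of systematic modification the framework applies.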

Abstract:

Ocean energy is a promising resource for renewable electricity generation with many advantages, such as being more predictable than wind energy, but also some disadvantages, such as large, slow amplitude variations in the generated power. This paper presents a hardware-in-the-loop prototype for studying the electric power profile generated by a wave power plant based on the oscillating water column (OWC) principle. In particular, it facilitates the development of new solutions to improve the intermittent profile of the power fed into the grid, and the testing of the OWC's behavior when facing a voltage dip. To obtain more realistic model behavior, statistical models of real waves have also been implemented.
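A common way to implement such statistical wave models (a generic sketch, not necessarily the prototype's implementation) is to synthesize the irregular free-surface elevation as a sum of harmonic components with random phases drawn from a target spectrum:

```python
import math
import random

random.seed(1)

# Hypothetical discretized wave spectrum: (amplitude in m, angular
# frequency in rad/s) pairs. Real implementations would derive these
# from a fitted spectrum of measured sea states.
components = [(0.5, 0.4), (0.8, 0.6), (0.6, 0.8), (0.3, 1.1)]
phases = [random.uniform(0, 2 * math.pi) for _ in components]

def elevation(t):
    """Irregular free-surface elevation at time t: superposition of
    harmonic components with random phases."""
    return sum(a * math.cos(w * t + p)
               for (a, w), p in zip(components, phases))

# Sampled elevation record; the OWC's pneumatic power roughly tracks
# the square of the surface motion, hence the large slow variations
# in the generated power profile.
samples = [elevation(0.1 * k) for k in range(1000)]
```

Feeding such records to the hardware-in-the-loop rig is what lets grid-smoothing strategies be tested against statistically realistic, rather than purely sinusoidal, input power.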

Abstract:

An immunoglobulin light chain protein was isolated from the urine of an individual (BRE) with systemic amyloidosis. The complete amino acid sequence of the variable region of the light chain (VL) established the protein as a kappa I, which, when compared with other kappa I amyloid-associated proteins, had unique residues, including Ile-34, Leu-40, and Tyr-71. To study the tertiary structure, BRE VL was expressed in Escherichia coli using a PCR product amplified from patient BRE's bone marrow DNA. The PCR product was ligated into pCZ11, a thermally inducible replication vector. Recombinant BRE VL was isolated, purified to homogeneity, and crystallized using ammonium sulfate as the precipitant. Two crystal forms were obtained. In crystal form I, the BRE VL kappa domain crystallizes as a dimer with unit cell constants isomorphous to previously published kappa protein structures. Comparison with a nonamyloid VL kappa domain from patient REI identified significant differences in the positions of residues in the hypervariable segments, plus variations in framework region (FR) segments 40-46 (FR2) and 66-67 (FR3). In addition, positional differences can be seen along the two types of local diads, corresponding to the monomer-monomer and dimer-dimer interfaces. From the packing diagram, a model for the amyloid light chain (AL) fibril is proposed, based on a pseudohexagonal spiral structure with a rise of approximately the width of two dimers per 360-degree turn. This spiral structure could be consistent with the dimensions of amyloid fibrils as determined by electron microscopy.

Abstract:

Studies addressing climate variability during the last millennium generally focus on variables with a direct influence on climate variability, such as the fast thermal response to varying radiative forcing or large-scale changes in atmospheric dynamics (e.g. the North Atlantic Oscillation). The ocean responds to these variations by slowly integrating the upper heat flux changes in depth, producing a delayed influence on ocean heat content (OHC) that can later impact low-frequency SST (sea surface temperature) variability through reemergence processes. In this study, both the externally and internally driven variations of the OHC during the last millennium are investigated using a set of fully coupled simulations with the ECHO-G (the coupled atmosphere model ECHAM4 and ocean model HOPE-G) atmosphere-ocean general circulation model (AOGCM). When compared to observations for the last 55 yr, the model tends to overestimate the global trends and underestimate the decadal OHC variability. Extending the analysis back over the last one thousand years, the main impact of the radiative forcing is an OHC increase at high latitudes, explained to some extent by a reduction in cloud cover and the subsequent increase of short-wave radiation at the surface. This OHC response is dominated by the effect of volcanism in the preindustrial era and by the fast increase of GHGs during the last 150 yr. Likewise, salient impacts of internal climate variability are observed at regional scales. For instance, upper-ocean temperature in the equatorial Pacific is controlled by ENSO (El Nino Southern Oscillation) variability from interannual to multidecadal timescales. Also, both the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO) intermittently modulate the interdecadal OHC variability in the North Pacific and mid-Atlantic, respectively. 
The NAO, through its influence on North Atlantic surface heat fluxes and convection, also plays an important role in the OHC at multiple timescales, leading first to a cooling in the Labrador and Irminger seas, and later to a North Atlantic warming, associated with a delayed impact on the AMO.

Abstract:

The ECHAM-1 T21/LSG coupled ocean-atmosphere general circulation model (GCM) is used to simulate climatic conditions at the last interglacial maximum (Eemian, 125 kyr BP). The results reflect the expected surface temperature changes (with respect to the control run) due to the amplification (reduction) of the seasonal cycle of insolation in the Northern (Southern) Hemisphere. A number of simulated features agree with previous results from atmospheric GCM simulations (e.g. intensified summer southwest monsoons), except in the Northern Hemisphere poleward of 30 degrees N, where dynamical feedbacks in the North Atlantic and North Pacific increase zonal temperatures about 1 degree C above what would be predicted from simple energy balance considerations. As this is the same area where most of the terrestrial geological data originate, this result suggests that previous estimates of Eemian global average temperature might have been biased by sample distribution. This conclusion is supported by the fact that the estimated global temperature increase of only 0.3 degrees C above the control run has previously been shown to be consistent with CLIMAP sea surface temperature estimates. Although the Northern Hemisphere summer monsoon is intensified, globally averaged precipitation over land is within about 1% of the present, contravening some geological inferences but not the deep-sea delta(13)C estimates of terrestrial carbon storage changes. Winter circulation changes in the northern Arabian Sea, driven by strong cooling on land, are as large as the summer circulation changes that are the usual focus of interest, suggesting that interpreting variations in the Arabian Sea sedimentary record solely in terms of the summer monsoon response could sometimes lead to errors. A small monsoonal response over northern South America suggests that interglacial paleotrends in this region were not just due to El Nino variations.

Abstract:

The sea level variation (SLVtotal) is the sum of two major contributions: steric and mass-induced. The steric contribution, SLVsteric, results from thermal and salinity changes in a given water column; it involves only a volume change and hence has no gravitational effect. The mass-induced contribution, SLVmass, on the other hand, arises from adding or subtracting water mass to or from the water column and has a direct gravitational signature. We examine the closure of the seasonal SLV budget and estimate the relative importance of the two contributions in the Mediterranean Sea as a function of time. We use ocean altimetry data (from the TOPEX/Poseidon, Jason-1, ERS, and ENVISAT missions) to estimate SLVtotal; temperature and salinity data (from the Estimating the Circulation and Climate of the Ocean model) to estimate SLVsteric; and time-variable gravity data (from the Gravity Recovery and Climate Experiment (GRACE) Project, April 2002 to July 2004) to estimate SLVmass. We find that the annual cycle of SLVtotal in the Mediterranean is mainly driven by SLVsteric but moderately offset by SLVmass. The agreement between the seasonal SLVmass estimated from SLVtotal minus SLVsteric and that from GRACE is quite remarkable; the annual cycle reaches its maximum in mid-February, almost half a cycle later than SLVtotal or SLVsteric, which peak in mid-October and mid-September, respectively. Thus, when sea level is rising (falling), the Mediterranean Sea is actually losing (gaining) mass. Furthermore, as SLVmass is balanced by vertical (precipitation minus evaporation, P-E) and horizontal (exchange of water with the Atlantic, the Black Sea, and river runoff) mass fluxes, we compared it with the P-E determined from meteorological data to estimate the annual cycle of the horizontal flux.
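The budget described above can be written compactly (a standard decomposition, stated here in the abstract's own notation):

```latex
\mathrm{SLV}_{\text{total}}(t) \;=\; \mathrm{SLV}_{\text{steric}}(t) \;+\; \mathrm{SLV}_{\text{mass}}(t),
\qquad
\frac{d\,\mathrm{SLV}_{\text{mass}}}{dt} \;=\; (P - E) \;+\; F_{\text{horiz}},
```

where F_horiz collects the horizontal water exchange with the Atlantic, the Black Sea and river runoff, expressed as an equivalent sea-level rate. Altimetry constrains the left-hand side of the first equation, hydrography the steric term, and GRACE the mass term, so the closure of the budget can be checked with three independent data sources, and the second equation then yields F_horiz as a residual once P-E is known.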

Abstract:

Scoping behavioral variations to dynamic extents is useful to support non-functional concerns that otherwise result in cross-cutting code. Unfortunately, such forms of scoping are difficult to obtain with traditional reflection or aspects. We propose delegation proxies, a dynamic proxy model that supports behavioral intercession through the interception of various interpretation operations. Delegation proxies permit different behavioral variations to be easily composed together. We show how delegation proxies enable behavioral variations that can propagate to dynamic extents. We demonstrate our approach with examples of behavioral variations scoped to dynamic extents that help simplify code related to safety, reliability, and monitoring.
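In a dynamic language, the flavour of the approach can be sketched as follows (a Python analogue for illustration; the paper's delegation-proxy model is more general and intercepts all interpretation operations, not just method calls): a proxy delegates to its target, and a behavioral variation, here call tracing, applies only within a dynamic extent.

```python
from contextlib import contextmanager

_active_variations = []  # variations active in the current dynamic extent

@contextmanager
def scoped(variation):
    """Activate a behavioral variation for a dynamic extent."""
    _active_variations.append(variation)
    try:
        yield
    finally:
        _active_variations.pop()

class Proxy:
    """Delegates attribute access to a target, letting the active
    variations intercept method calls (behavioral intercession)."""
    def __init__(self, target):
        object.__setattr__(self, "_target", target)

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if callable(attr):
            def wrapped(*args, **kwargs):
                for vary in _active_variations:
                    vary(name, args)          # e.g. monitoring, logging
                return attr(*args, **kwargs)
            return wrapped
        return attr

class Account:
    def __init__(self): self.balance = 0
    def deposit(self, n): self.balance += n

log = []
acct = Proxy(Account())
acct.deposit(5)                               # outside any extent: untraced
with scoped(lambda op, args: log.append((op, args))):
    acct.deposit(10)                          # inside the extent: traced
```

Because variations live on a stack scoped to the extent, several can be composed simply by nesting `with scoped(...)` blocks, which mirrors the composition property the paper claims for delegation proxies.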