965 results for target model
Abstract:
The similarity of issues and geographical proximity have led the Visegrad 4 countries (V4) to undertake closer collaboration in natural gas policy, notably by agreeing on a common security of supply strategy, including regional emergency planning, and a common implementation of the Gas Target Model (GTM) that European regulators have proposed for the medium- to long-term design of the EU gas market, and which has been endorsed by the Madrid Regulatory Forum. As a contribution to this collaboration, the present paper analyses how the GTM may be implemented in the V4 region, with a view to maximizing the benefits that arise from joint implementation. A most relevant conclusion of the GTM is that markets should be large enough to attract market players and investments, so that sufficient diversity of sources may be reached and market power indicators are kept below dangerous levels. In most cases, this requires physical and/or virtual interconnection of present markets, which is also useful for achieving the required security of supply standards, as envisaged in Regulation (EU) 994/2010.
Abstract:
The purpose of this paper is to survey and assess the state of the art in automatic target recognition for synthetic aperture radar imagery (SAR-ATR). The aim is not to develop an exhaustive survey of the voluminous literature, but rather to capture in one place the various approaches for implementing the SAR-ATR system. This paper is meant to be as self-contained as possible, and it approaches the SAR-ATR problem from a holistic end-to-end perspective. A brief overview of the breadth of the SAR-ATR challenges is given. This is couched in terms of a single-channel SAR, and it is extendable to multi-channel SAR systems. Stages pertinent to the basic SAR-ATR system structure are defined, and the motivations behind the requirements and constraints on the system constituents are addressed. For each stage in the SAR-ATR processing chain, a taxonomization methodology for surveying the numerous methods published in the open literature is proposed. Carefully selected works from the literature are presented under the proposed taxa. Novel comparisons, discussions, and comments are highlighted throughout this paper. A two-fold benchmarking scheme for evaluating existing SAR-ATR systems and motivating new system designs is proposed, and applied to the works surveyed in this paper. Finally, a discussion is presented in which various interrelated issues, such as standard operating conditions, extended operating conditions, and target-model design, are addressed. This paper is a contribution toward fulfilling an objective of end-to-end SAR-ATR system design.
Abstract:
The aim of this study was to create a target model for the fault management services of TeliaSonera Finland Oyj. The target model was to be formed so that it supports high-quality, productive industrialized service production. The target model was constructed on the basis of a relatively broad literature review, conducted from the perspective of knowledge. Within the framework of service management, service performance was examined, focusing in particular on quality, productivity, and the psychosocial work environment of service production. Research findings on the industrialization of services were combined with this, and the general success factors of a knowledge-intensive organization were taken into account. In this way, a framework for the management and development of knowledge-intensive service production was formed and applied to defining the target state of the fault management services. The target state of fault management was partly tested empirically, but further research is needed to evaluate the theoretical framework developed in this work.
Abstract:
The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of newly synthesized compounds increases. The potency of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for the ER ligand binding assay. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) versus the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and also held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in a time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on quenching of the label by a soluble quencher molecule when the label is displaced from the receptor into the solution phase by an unlabeled competing ligand. The new method was compared in parallel with the standard FP method, shown to yield comparable results, and found to offer a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression-level assay alone.
In this work, an immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in adhesion molecule expression was analyzed on fixed cells by immunocytochemistry utilizing specific long-lifetime luminophore-labeled antibodies against chosen adhesion molecules. The results showed that the method is suitable for a multi-parametric assay of the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy.
Abstract:
Model transformation consists of transforming a source model into a target model in accordance with source and target meta-models. We distinguish two types of transformation. The first is exogenous, where the source and target meta-models represent different formalisms and all elements of the source model are transformed. When a single formalism is concerned, the transformation is endogenous. This type of transformation generally requires two steps: identifying the elements of the source model to be transformed, then transforming those elements. In this thesis, we propose three main contributions related to these transformation problems. The first contribution is the automation of model transformations. We propose to treat the transformation problem as a combinatorial optimization problem in which a target model can be generated automatically from a small number of transformation examples. This first contribution applies to both exogenous and endogenous transformations (once the elements to be transformed have been detected). The second contribution concerns endogenous transformation, where the elements of the source model to be transformed must be detected. We propose an approach for detecting design defects as a step prior to refactoring. This approach is inspired by the principle by which the human immune system detects viruses, known as negative selection. The idea is to use good implementation practices to detect the parts of the code that are at risk. The third contribution aims to test a transformation mechanism using an oracle function to detect errors. We adapted the negative selection mechanism so that any deviation between the transformation traces under evaluation and a base of examples containing high-quality transformation traces is considered an error. The oracle function computes this dissimilarity, and errors are ranked by this score. The various contributions were evaluated on large projects, and the results obtained demonstrate their effectiveness.
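The oracle mechanism can be sketched in a few lines (a minimal illustration, assuming traces are represented as sets of (rule, source type, target type) tuples; all names and data below are hypothetical, not the thesis's actual trace format):

```python
def dissimilarity(trace, base_traces):
    """Deviation of a transformation trace from the closest known-good
    trace, as 1 - Jaccard similarity (0.0 means an exact match)."""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 1.0
    return min(1.0 - jaccard(trace, good) for good in base_traces)

def rank_suspicious(traces, base_traces):
    """Order traces by decreasing dissimilarity: most suspicious first."""
    return sorted(traces, key=lambda t: dissimilarity(t, base_traces),
                  reverse=True)

# Base of high-quality traces, and two traces under evaluation.
base = [{("r1", "Class", "Table"), ("r2", "Attr", "Column")}]
candidates = [
    {("r1", "Class", "Table"), ("r2", "Attr", "Column")},  # matches the base
    {("r9", "Class", "View"), ("r2", "Attr", "Column")},   # deviates
]
ranked = rank_suspicious(candidates, base)  # deviating trace comes first
```

As in negative selection, anything not recognized as "self" (the base of good traces) receives a high score and is flagged first.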
Abstract:
Model-driven engineering (MDE) is a well-established software engineering paradigm that advocates the use of models as first-class artifacts in software development and maintenance activities. The manipulation of several models during the software life cycle motivates the use of model transformations (MT) to automate model generation and update operations whenever possible. Writing model transformations nevertheless remains an arduous task, requiring both considerable knowledge and effort, which calls into question the benefits brought by MDE. To address this problem, much research has focused on automating MT. Model transformation by example (MTBE) is a promising approach in this regard. MTBE aims to learn model transformation programs from a set of pairs of source and target models supplied as examples. In this work, we propose a process for learning model transformations by example. It aims to learn complex model transformations by addressing three observed requirements, namely, exploring the context in the source model, checking source attribute values, and deriving complex target attributes. We validate our approach experimentally on 7 model transformation cases. Three of the seven learned transformations yield perfect target models. Moreover, precision and recall above 90% are recorded for the target models produced by the four remaining transformations.
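To make the by-example idea concrete, here is a deliberately tiny sketch that infers type mappings from source/target example pairs by co-occurrence voting (every name and the bag-of-types representation are hypothetical; the actual approach learns far richer transformation programs):

```python
from collections import Counter

def learn_type_mapping(example_pairs):
    """example_pairs: list of (source_types, target_types) pairs.
    For each source type, vote for the target types it co-occurs with
    and keep the most frequent one."""
    votes = {}
    for src, tgt in example_pairs:
        for s in set(src):
            votes.setdefault(s, Counter()).update(set(tgt))
    return {s: c.most_common(1)[0][0] for s, c in votes.items()}

# Three hypothetical source/target model pairs (class diagram -> schema).
examples = [
    (["Class", "Attribute"], ["Table", "Column"]),
    (["Class"], ["Table"]),
    (["Attribute"], ["Column"]),
]
mapping = learn_type_mapping(examples)
```

Co-occurrence voting resolves the ambiguity of the first pair using the smaller, unambiguous examples, which is the essence of learning from a set of examples rather than a single one.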
Abstract:
Abstract Background A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. The most common way to decide whether a given probability is sufficient is Bayesian binary classification, where the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model. A null model can be used as the alternative model. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final result of classifications. In particular, we are interested in minimizing the number of false predictions in a classification, a crucial issue for reducing the costs of biological validation. Results In all tests with random sequences, the target null model produced the lowest number of false positives. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study used randomly generated sequences. Previous studies were performed on amino acid sequences, using only one probabilistic model (HMM) and a specific benchmark, and lacked more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model exhibits a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
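The scoring setup under study can be illustrated with a minimal position-independent sketch (not the HMMER/SAM/INFERNAL implementations; the family distribution and query below are invented). The "target" null model is built from the composition of the query sequence itself, which is what curbs false positives for compositionally biased sequences:

```python
import math

def log_odds(seq, family_probs, null_probs):
    """Sum of per-residue log-odds scores (in bits)."""
    return sum(math.log2(family_probs[c] / null_probs[c]) for c in seq)

def uniform_null(alphabet="ACGT"):
    return {c: 1.0 / len(alphabet) for c in alphabet}

def target_null(seq, alphabet="ACGT", pseudo=1.0):
    """Null model from the target sequence's own residue composition,
    with a pseudocount to avoid zero probabilities."""
    total = len(seq) + pseudo * len(alphabet)
    return {c: (seq.count(c) + pseudo) / total for c in alphabet}

family = {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}  # invented GC-rich family
query = "GCGCGCGGCC"                               # GC-rich random query
score_uniform = log_odds(query, family, uniform_null())
score_target = log_odds(query, family, target_null(query))
```

Against the uniform null, the GC-rich query scores well purely because of its composition; against the target null, that compositional advantage is cancelled, so the score drops, illustrating the higher specificity reported above.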
Abstract:
The report explores the problem of detecting complex point target models in a MIMO radar system. A complex point target is a mathematical and statistical model for a radar target that is not resolved in space but exhibits varying complex reflectivity across the different bistatic view angles. The complex reflectivity can be modeled as a complex stochastic process whose index set is the set of all bistatic view angles, and the parameters of the stochastic process follow from an analysis of a target model comprising a number of ideal point scatterers randomly located within some radius of the target's center of mass. The proposed complex point targets may be applicable to statistical inference in multistatic or MIMO radar systems. Six different target models are summarized here – three 2-dimensional (Gaussian, Uniform Square, and Uniform Circle) and three 3-dimensional (Gaussian, Uniform Cube, and Uniform Sphere) – which assume different distributions for the locations of the point scatterers within the target. We develop data models for the signals received from such targets in a MIMO radar system with distributed assets and partially correlated signals, and consider the resulting detection problem, which reduces to the familiar Gauss-Gauss detection problem. We illustrate that the target parameters and the transmit signal influence detector performance through the target extent and the SNR, respectively. A series of receiver operating characteristic (ROC) curves is generated to show the impact of varying SNR on the detector. The Kullback–Leibler (KL) divergence is applied to measure the approximate mean difference between the density functions that the scatterers assume inside the target models, showing how detector performance changes with the target extent of the point scatterers.
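As a scalar illustration of how target extent enters a Gauss-Gauss problem (the report's models are multivariate; this one-dimensional version and all its numbers are purely illustrative), the KL divergence between the target-present and noise-only densities grows with the extra variance contributed by the target:

```python
import math

def kl_gauss(var_p, var_q):
    """KL divergence KL(N(0, var_p) || N(0, var_q)) between zero-mean
    scalar Gaussians."""
    return 0.5 * (math.log(var_q / var_p) + var_p / var_q - 1.0)

noise_var = 1.0
extents = (0.5, 1.0, 2.0)  # hypothetical target extents
# H1 variance = noise variance plus a variance term growing with extent
divergences = [kl_gauss(noise_var + e ** 2, noise_var) for e in extents]
# divergences increase with extent: a larger target is easier to detect
```

A larger divergence between the two hypothesis densities corresponds to a better-separated detection problem, which is the qualitative trend the report's ROC curves exhibit.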
Abstract:
There is currently great interest in the natural gas market. There are many reasons why this fuel is positioned as one of the most important in the global energy landscape. Besides filling the gap left by coal and oil, it is a much cleaner alternative that could be developed further at the domestic and industrial levels as well as in transportation. The natural gas industry is changing rapidly, driven mainly by the emergence of unconventional gas and its extraction techniques, which are changing the economics of gas production as well as the dynamics and movements of LNG across the planet. The purpose of this study is to assess the state of the natural gas sector and market worldwide and thereby highlight the main regions that set the general price trends across the globe. In addition, this work presents the forecasts expected for the coming years, together with a summary of the trends followed to date. Particular attention is paid to the move towards hub-based systems, which began in the US and reached the United Kingdom and the European continent in the early twenty-first century. This is the trend that Spain intends to adopt in order to achieve greater competitiveness, flexibility and liquidity in prices and in the gas system. In this way, the structure of a single European market, the final objective set by the member states' bodies, will gradually be built. However, launching this new model requires a series of changes to the system, such as amending the Hydrocarbons Law, designating a market operator, drawing up rules to regulate the market, and fostering market liquidity.
Once the regulatory change takes place, the liquidity of the Spanish system will increase, opening the opportunity to create new ways of balancing gas portfolios and new strategies for managing risk. Nevertheless, before the legislative changes take effect, one of the models proposed in the Gas Target Model, the so-called Implicit Capacity Allocation Model, would be implemented. Introducing this model would be a first step towards the integration of a gas market without the need for legislative change, and would serve as an impetus towards the Market Area Model, which would be the best fit for the Spanish gas system and would connect broadly with the rest of the European markets. The study's conclusions regarding the formation of the new hub-based model point to the need to make the most of the new situation and to implement the hub as soon as possible, in order to give the system greater competition and liquidity. In addition, the Spanish system should take advantage of its large capacity and modern infrastructure to turn the country into the gas gateway of south-western Europe, thereby strengthening the security of supply of the member states. Another conclusion that can be drawn from the report is the need to increase the penetration rate of gas in Spain and to encourage its consumption over other fossil fuels such as coal and oil. This would position natural gas as the main backup energy for renewables and would allow the price per kilowatt-hour of natural gas to fall. The study and analysis of the dynamics of the global gas industry are essential for anticipating and planning the best strategies in the face of the changes that will gradually reshape the gas sector and market. ABSTRACT There is a great deal of focus on the natural gas market at the moment.
Whether you view natural gas as bridging the gap between coal/oil and an altogether cleaner solution yet to be determined, or as a destination fuel which will be used not only for heating and gas-fired generation but also as a transportation fuel, there is no doubt that natural gas will have an increasingly important role to play in the global energy landscape. The natural gas industry is changing rapidly, as shale gas exploration changes the economics of gas production and LNG connects regions across the globe. The purpose of this study is to outline the present state of the global gas industry, highlighting the differing models around the world. This study pays particular attention to the move towards hub-based pricing that took hold first in the US and, over the past decade, across the UK and Continental Europe. In the coming years the Spanish model will move towards hub-based pricing. As gas market regulatory change takes hold, liquidity in the Spanish gas market will increase, bringing with it new ways to balance gas portfolios and placing an increasing focus on managing price risk. This study in turn establishes the links between the changes that have taken place in other markets as a way to better understand how the Spanish market will evolve in the coming years.
Abstract:
This CEPS Task Force Report focuses on whether there is a need to adapt the EU’s electricity market design and, if so, on the options for doing so. In a first step, it analyses current market trends, distinguishing between their causes and their consequences. The current blueprint of EU power market design – the target model – is then briefly introduced, followed by a discussion of the shortcomings of the current approach and the challenges in finding suitable solutions. The final chapter offers an inventory of solutions, differentiating between recommendations shared among Task Force members and non-consensual options.
Abstract:
With the advent of high-performance computing devices, deep neural networks have gained great popularity in solving many Natural Language Processing tasks. However, they are also vulnerable to adversarial attacks, which modify the input text in order to mislead the target model. Adversarial attacks are a serious threat to the security of deep neural networks, and they can be used to craft adversarial examples that steer the model towards a wrong decision. In this dissertation, we propose SynBA, a novel contextualized synonym-based adversarial attack for text classification. SynBA is based on the idea of replacing words in the input text with synonyms selected according to the context of the sentence. We show that SynBA generates adversarial examples that fool the target model with a high success rate. We demonstrate three advantages of the proposed approach: (1) effectiveness - it outperforms state-of-the-art attacks on semantic similarity and perturbation rate; (2) utility preservation - it preserves semantic content and grammaticality, and its outputs are still assigned the correct class by human judges; and (3) efficiency - it performs attacks faster than other methods.
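In the same spirit (though NOT the actual SynBA implementation: the synonym table and the classifier below are toy stand-ins for its contextualized components), a greedy synonym-substitution attack can be sketched as follows:

```python
# Hypothetical synonym table; SynBA derives candidates contextually.
SYNONYMS = {"good": ["fine", "decent"], "terrible": ["awful", "dreadful"]}

def attack(text, classify, synonyms=SYNONYMS):
    """Greedily swap words for synonyms while the swap lowers the victim
    model's confidence; stop as soon as the predicted label flips."""
    words = text.split()
    label, conf = classify(" ".join(words))
    for i, w in enumerate(words):
        for cand in synonyms.get(w, []):
            trial = words[:i] + [cand] + words[i + 1:]
            new_label, new_conf = classify(" ".join(trial))
            if new_label != label:        # misclassification achieved
                return " ".join(trial)
            if new_conf < conf:           # keep the weakening swap
                words, conf = trial, new_conf
                break
    return " ".join(words)

def demo_classify(text):
    """Stand-in sentiment 'model': a tiny lexicon score."""
    lexicon = {"good": 1, "fine": 1, "decent": 0, "terrible": -1}
    score = sum(lexicon.get(w, 0) for w in text.split())
    return ("pos" if score > 0 else "neg", abs(score))

adversarial = attack("good movie", demo_classify)  # label flips to "neg"
```

The real attack ranks candidate substitutions with contextual language-model scores rather than a fixed table, but the greedy search loop has this overall shape.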
Abstract:
A model predictive controller (MPC) is proposed that is robustly stable for certain classes of model uncertainty and for unknown disturbances. The case of open-loop stable systems is considered, where only the inputs and controlled outputs are measured, and it is assumed that the controller will operate in a scenario where target tracking is also required. The nominal infinite-horizon MPC is extended here to output feedback. The method considers an extended cost function that can be made globally convergent for any finite input horizon considered for the uncertain system, and is based on the explicit inclusion of cost-contracting constraints in the control problem. The controller handles the output feedback case through a non-minimal state-space model built from past output measurements and past input increments. The application of the robust output-feedback MPC is illustrated through the simulation of a low-order multivariable system.
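A minimal receding-horizon sketch for a scalar open-loop-stable plant x+ = a*x + b*u tracking a target r may help fix the idea (illustrative only; the paper's robust output-feedback formulation with cost-contracting constraints is far richer, and every number here is made up):

```python
def mpc_step(x, r, a=0.8, b=1.0, horizon=5, candidates=None):
    """Return the input minimizing a finite-horizon quadratic tracking
    cost, evaluated over a coarse grid of constant-input candidates."""
    if candidates is None:
        candidates = [i / 10 for i in range(-20, 21)]
    def cost(u):
        total, xi = 0.0, x
        for _ in range(horizon):
            xi = a * xi + b * u                  # predicted state
            total += (xi - r) ** 2 + 0.01 * u ** 2
        return total
    return min(candidates, key=cost)

x, r = 0.0, 1.0                                  # initial state and target
for _ in range(30):                              # closed-loop receding horizon
    u = mpc_step(x, r)                           # optimize, apply first input
    x = 0.8 * x + 1.0 * u                        # plant update
```

Only the first optimized input is applied at each step and the optimization is repeated from the new state, which is the receding-horizon principle underlying any MPC scheme, including the robust one proposed above.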
Abstract:
The erosion depth profile of planar targets in balanced and unbalanced magnetron cathodes with cylindrical symmetry is measured along the target radius. The magnetic fields have rotational symmetry. The horizontal and vertical components of the magnetic field B are measured at points above the cathode target at z = 2 × 10^-3 m. The experimental data reveal that the target erosion depth profile is a function of the angle θ made by B with the horizontal line defined by z = 2 × 10^-3 m. To explain this dependence, a simplified model of the discharge is developed. Within the model, the pathway lengths of the secondary electrons in the pre-sheath region are calculated by analytical integration of the Lorentz differential equations. Weighting these lengths with the distribution law of the mean free path of the secondary electrons, we estimate the densities of ionizing events over the cathode and the relative flux of the sputtered atoms. The expression so deduced correlates, for the first time, the erosion depth profile of the target with the angle θ. The model fits the experimental target erosion depth profiles reasonably well, confirming that ionization occurs mainly in the pre-sheath zone.
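The quantity at the heart of the model, the pathway length of a secondary electron under the Lorentz force, can be sketched numerically (the paper integrates the equations analytically; this Euler sketch with made-up field values only illustrates what is being computed):

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def pathway_length(E, B, v0=(0.0, 0.0, 0.0), q=-1.602e-19, m=9.109e-31,
                   dt=1e-12, steps=2000):
    """Euler-integrate m dv/dt = q (E + v x B) for one electron and
    accumulate the distance travelled (pathway length, in metres)."""
    v, length = list(v0), 0.0
    for _ in range(steps):
        vxB = cross(v, B)
        v = [v[i] + q * (E[i] + vxB[i]) / m * dt for i in range(3)]
        length += sum(c * c for c in v) ** 0.5 * dt
    return length

# Made-up field values: a weak vertical E field and a horizontal B field.
path = pathway_length(E=(0.0, 0.0, -100.0), B=(0.02, 0.0, 0.0))
```

In the model these lengths are then weighted by the mean-free-path distribution of the secondary electrons to estimate where ionizing events concentrate over the cathode.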
Abstract:
Hong Kong’s currency is pegged to the US dollar in a currency board arrangement. In autumn 2003, the Hong Kong dollar appreciated from close to 7.80 per US dollar to 7.70, as investors feared that the currency board would be abandoned. In the wake of this appreciation, the monetary authorities revamped the one-sided currency board mechanism into a symmetric two-sided system with a narrow exchange rate band. This paper reviews the characteristics of the new currency board arrangement and embeds a theoretical soft-edge target zone model, typical of many intermediate regimes, to explain the notable achievement of speculative peace and credibility since May 2005.