885 results for Multiple-model filter


Relevance: 30.00%

Abstract:

Although microturbines (MTs) are among the most successfully commercialized distributed energy resources, their long-term effects on the distribution network have not been fully investigated, owing to the complex thermo-fluid-mechanical energy conversion processes involved. This is further complicated by the fact that the parameters and internal data of MTs are not always available to the electric utility, due to different ownerships and confidentiality concerns. To address this issue, a general modeling approach for MTs is proposed in this paper, which allows for long-term simulation of the distribution network with multiple MTs. First, the feasibility of deriving a simplified MT model for long-term dynamic analysis of the distribution network is discussed, based on a physical understanding of the dynamic processes occurring within MTs. Then, a three-stage identification method is developed to obtain a piecewise MT model and predict electro-mechanical system behavior with saturation. Next, assisted by a power flow calculation tool, a fast simulation methodology is proposed to evaluate the long-term impact of multiple MTs on the distribution network. Finally, the model is verified using Capstone C30 microturbine experiments and further applied to the dynamic simulation of a modified IEEE 37-node test feeder, with promising results.

Relevance: 30.00%

Abstract:

Li-ion batteries have been widely used in electric vehicles, and battery internal state estimation plays an important role in the battery management system. It is technically challenging, however, in particular for the estimation of the battery internal temperature and state-of-charge (SOC), two key state variables affecting battery performance. In this paper, a novel method is proposed for real-time simultaneous estimation of these two internal states, leading to a significantly improved battery model for real-time SOC estimation. To achieve this, a simplified battery thermoelectric model is first built, which couples a thermal submodel and an electrical submodel. The interactions between the battery's thermal and electrical behaviours are captured, offering a comprehensive description of both. To achieve more accurate internal state estimates, the model is trained by the simulation error minimization method, and the model parameters are optimized by a hybrid method combining a meta-heuristic algorithm and the least-squares approach. Further, time-varying model parameters under different heat dissipation conditions are considered, and a joint extended Kalman filter is used to simultaneously estimate both the battery internal states and the time-varying model parameters in real time. Experimental results based on testing data from LiFePO4 batteries confirm the efficacy of the proposed method.
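A minimal sketch of the joint (augmented-state) extended Kalman filter idea is given below. It uses a hypothetical toy battery model (coulomb counting with a linearised OCV curve and constant capacity), not the coupled thermoelectric model described above; all parameter values are illustrative assumptions.

```python
import numpy as np

# Toy model assumptions (not the paper's model):
#   soc_{k+1} = soc_k - i*dt/Q      (coulomb counting)
#   R_{k+1}   = R_k                  (parameter as a random walk)
#   v_k       = a*soc_k + b - i_k*R_k   (linearised OCV curve)
dt, Q = 1.0, 3600.0        # step [s], capacity [C] (1 Ah, assumed)
a, b = 0.7, 3.2            # assumed linear OCV coefficients

def joint_ekf(currents, voltages, x0, P0, Qn, Rn):
    """Jointly estimate x = [soc, R] from current/voltage data."""
    x, P = np.array(x0, float), np.array(P0, float)
    hist = []
    for i, v in zip(currents, voltages):
        # predict: the state transition Jacobian is the identity here
        x = np.array([x[0] - i * dt / Q, x[1]])
        P = P + Qn
        # update with the terminal voltage measurement
        H = np.array([[a, -i]])              # measurement Jacobian
        y = v - (a * x[0] + b - i * x[1])    # innovation
        S = H @ P @ H.T + Rn
        K = P @ H.T / S                      # Kalman gain (2x1)
        x = x + K.ravel() * y
        P = (np.eye(2) - K @ H) @ P
        hist.append(x.copy())
    return np.array(hist)

# simulate data from the same toy model, true R = 0.05 ohm;
# a varying load current keeps soc and R jointly observable
rng = np.random.default_rng(0)
t = np.arange(200)
cur = 2.0 + 1.5 * np.sin(0.1 * t)
soc = 0.9 - np.cumsum(cur * dt / Q)
volt = a * soc + b - cur * 0.05 + rng.normal(0, 1e-3, 200)
est = joint_ekf(cur, volt, x0=[0.8, 0.1], P0=np.eye(2) * 0.1,
                Qn=np.diag([1e-10, 1e-8]), Rn=np.array([[1e-6]]))
```

With a constant current, soc and R errors would be indistinguishable (both shift the voltage by a constant), which is why the simulated load varies over time.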

Relevance: 30.00%

Abstract:

Collaboration in the public sector is imperative to achieve e-government objectives such as improved efficiency and effectiveness of public administration and improved quality of public services. Collaboration across organizational and institutional boundaries requires public organizations to share e-government systems and services through, for instance, interoperable information technology and processes. Demands on public organizations to become more open also require them to adopt new collaborative approaches for inviting and engaging citizens in governmental activities. E-government-related collaboration in the public sector is challenging, however, and collaboration initiatives often fail. Public organizations need to learn how to collaborate, since the forms of e-government collaboration and their expected outcomes are mostly unknown. How public organizations can collaborate, and with what expected outcomes, is therefore investigated in this thesis by studying multiple collaboration cases on the acquisition and implementation of a particular e-government investment (a digital archive). The thesis also investigates how e-government collaboration can be facilitated through artifacts. This is done through a case study, in which objects that cross boundaries between collaborating communities in the public sector are studied, and by designing a configurable process model integrating several processes for social services. Using design science, the thesis further investigates how an m-government solution that facilitates collaboration between citizens and public organizations can be designed. The thesis contributes to the literature by describing five different modes of interorganizational collaboration in the public sector and the expected benefits of each mode. It also contributes an instantiation of a configurable process model supporting three open social e-services, along with evidence of how it can facilitate collaboration.

This thesis further describes how boundary objects facilitate collaboration between different communities in an open government design initiative. It contributes a designed mobile government solution, thereby providing a proof of concept and initial design implications for enabling collaboration with citizens through citizen sourcing (outsourcing a governmental activity to citizens through an open call). The thesis also identifies research streams within e-government collaboration research through a literature review, and its contributions are related to the identified streams. It gives directions for future research by suggesting a further focus on understanding e-government collaboration and how information and communication technology can facilitate collaboration in the public sector. It is suggested that further research investigate m-government solutions in order to form design theories, and examine how value can be co-created in e-government collaboration.

Relevance: 30.00%

Abstract:

[Translated from German] This work investigates optical filter arrays for high-quality spectroscopic applications in the visible (VIS) wavelength range. The optical filters, consisting of Fabry-Pérot (FP) filters for high-resolution miniaturized optical nanospectrometers, are based on two highly reflective dielectric mirrors with an intermediate resonance cavity made of polymer. Depending on the height of its resonance cavity, each filter transmits a narrow spectral band (referred to in this work as a filter line). The efficiency of such optical filters depends on the precise fabrication of highly selective multispectral filter arrays of FP filters by low-cost, high-throughput methods. The multiple spectral filters covering the entire visible range are fabricated on a single substrate in a single imprint step using 3D nanoimprint technology with very high vertical resolution. The key to this process integration is the fabrication of 3D nanoimprint stamps containing the desired arrays of filter cavities. The spectral sensitivity of these efficient optical filters depends on the accuracy of the vertically varying cavities, which are fabricated by a large-area 'soft' nanoimprint technique, UV substrate conformal imprint lithography (UV-SCIL). The main problems of UV-based SCIL processes, such as a non-uniform residual layer thickness and shrinkage of the polymer, limit the potential applications of this technology. A thin and uniform residual layer is essential so that the critical dimensions of the functional 3D pattern can be controlled during the plasma etching that removes the residual layer.

In the case of the nanospectrometer, the cavities of neighboring FP filters vary vertically, so the volume of each individual filter changes, which leads to a variation of the residual layer thickness underneath each filter. The volumetric shrinkage caused by the polymerization process affects the size and dimensions of the imprinted polymer cavities. The performance of the large-area UV-SCIL process is improved by using a volume-equalized stamp design and by optimizing the process conditions. The volume-equalized stamp design distributes 64 vertically varying filter cavities into units of 4 cavities that share a common average volume. Using the volume-equalized design, uniform residual layer thicknesses (110 nm) are obtained across all filter heights. The polymer shrinkage is analyzed quantitatively in the lateral and vertical directions of the FP filters. Shrinkage in the vertical direction has the largest influence on the spectral response of the filters and is reduced from 12% to 4% by changing the exposure time. FP filters fabricated with the volume-equalized stamp and the optimized imprint process show a high-quality spectral response with a linear relationship between the cavity heights and the spectral positions of the corresponding filter lines.

Relevance: 30.00%

Abstract:

[Translated from German] This work addresses the fabrication of miniaturized NIR spectrometers based on Fabry-Pérot (FP) filter arrays. To date, low-cost patterning of homogeneous, vertically extended cavities for NIR FP filters by nanoimprint technology has not been available, because the layer quality of the imprint material is insufficient and the low mobility of the imprint materials does not suffice to fill the vertically extended cavities. This work concentrates on reducing the technical effort required to fabricate homogeneous, vertically extended cavities. The cavities are patterned with a large-area UV nanoimprint process, substrate conformal imprint lithography (SCIL), which is based on a hybrid stamp and combines the advantages of hard and soft stamps. To overcome the limitations mentioned above, alternative cavity designs are investigated and a new imprint material is employed. Three design solutions for fabricating homogeneous, extended cavities are investigated and compared: (i) applying the imprint material by multiple spin-coating steps to produce a thicker layer of imprint material before the imprint process; (ii) using a hybrid cavity consisting of a patterned layer of imprint material embedded between two silicon oxide layers, to extend the thickness of the organic cavity; and (iii) optimizing the imprint process by using a new imprint material. The FP filter arrays fabricated with these three approaches show high transmittance (best transmission > 90%) and small linewidths (FWHM < 5 nm).

Relevance: 30.00%

Abstract:

SANTANA, André M.; SOUZA, Anderson A. S.; BRITTO, Ricardo S.; ALSINA, Pablo J.; MEDEIROS, Adelardo A. D. Localization of a mobile robot based on odometry and natural landmarks using extended Kalman Filter. In: INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, 5., 2008, Funchal, Portugal. Proceedings... Funchal, Portugal: ICINCO, 2008.

Relevance: 30.00%

Abstract:

[EN] Marine turtles undergo dramatic ontogenic changes in body size and behavior, with the loggerhead sea turtle, Caretta caretta, typically switching from an initial oceanic juvenile stage to one in the neritic, where maturation is reached and breeding migrations are subsequently undertaken every 2-3 years [1-3]. Using satellite tracking, we investigated the migratory movements of adult females from one of the world's largest nesting aggregations at Cape Verde, West Africa. In direct contrast with the accepted life-history model for this species [4], results reveal two distinct adult foraging strategies that appear to be linked to body size. The larger turtles (n = 3) foraged in coastal waters, whereas smaller individuals (n = 7) foraged oceanically.

Relevance: 30.00%

Abstract:

Traditional utility analysis only calculates the value of a given selection procedure over random selection. This assumption is not only an inaccurate representation of staffing policy but also leads to overestimates of a device's value. This paper presents a more accurate method for computing the validity of a selection battery when there are multiple selection devices and multiple criteria. Application of the method is illustrated using previous utility analysis work and an actual case of administrative assistants with eight predictors and nine criteria. A final example is also provided that combines these advancements with other researchers' advances in a combined utility model. Results reveal that accounting for multiple criteria and outcomes dramatically reduces the utility estimates of implementing new selection devices.

Relevance: 30.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

This work presents a periodic state space model for monthly temperature data. Additionally, some issues are discussed, such as parameter estimation and the Kalman filter recursions adapted to a periodic model. The framework is applied to a long-term monthly temperature time series from Lisbon.
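The Kalman recursions adapted to a periodic model can be sketched as follows. The observation equation here is a toy assumption (a period-12 monthly mean plus an AR(1) state), and all parameter values are illustrative rather than estimates for the Lisbon series.

```python
import numpy as np

# Illustrative parameters (assumed, not estimated):
phi, q, r = 0.8, 0.2, 0.5        # AR coefficient, state/obs variances
mu = 15 + 8 * np.sin(2 * np.pi * np.arange(12) / 12)  # monthly means

def periodic_kalman(y):
    """Filter y_t = mu[t % 12] + x_t + eps_t,  x_t = phi*x_{t-1} + eta_t.
    The period-12 dependence enters through the month-indexed mean."""
    x, P, loglik, xs = 0.0, 1.0, 0.0, []
    for t, yt in enumerate(y):
        x, P = phi * x, phi ** 2 * P + q     # predict
        v, S = yt - mu[t % 12] - x, P + r    # innovation and its variance
        K = P / S                            # Kalman gain
        x, P = x + K * v, (1 - K) * P        # update
        loglik += -0.5 * (np.log(2 * np.pi * S) + v ** 2 / S)
        xs.append(x)
    return np.array(xs), loglik

# simulate 20 years of monthly data from the same model
rng = np.random.default_rng(1)
n = 240
x_true = np.zeros(n)
for t in range(1, n):
    x_true[t] = phi * x_true[t - 1] + rng.normal(0, np.sqrt(q))
y = mu[np.arange(n) % 12] + x_true + rng.normal(0, np.sqrt(r), n)
xf, ll = periodic_kalman(y)
```

In a fully periodic model the variances phi, q, and r could also be indexed by `t % 12`; the accumulated log-likelihood is what a maximum-likelihood parameter estimation step would optimize.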

Relevance: 30.00%

Abstract:

The purpose of the study was to explore how a public IT-services transferor organization, comprised of autonomous entities, can effectively develop and organize its data center cost recovery mechanisms in a fair manner. The lack of a well-defined model for charges and a cost recovery scheme can cause various problems; for example, one entity may subsidize the costs of another. Transfer pricing is in the best interest of each autonomous entity in a CCA. While transfer pricing plays a pivotal role in setting the prices of services and intangible assets, TCE focuses on the arrangement at the boundary between entities. TCE is concerned with the cost, autonomy, and cooperation issues of an organization; the theory addresses the factors that influence intra-firm transaction costs and attempts to expose the problems involved in determining the charges or prices of transactions. This study was carried out as a single case study in a public organization. The organization intended to take over the IT services of its affiliated public entities and was in the process of establishing a municipal joint data center. Nine semi-structured interviews, including two pilot interviews, were conducted with experts and managers of the case company and its affiliated entities. The purpose of these interviews was to explore the charging and pricing issues of intra-firm transactions. To process and summarize the findings, this study employed qualitative techniques with multiple methods of data collection. By reviewing TCE theory and a sample of the transfer pricing literature, the study created an IT-services pricing framework as a conceptual tool for illustrating the structure of transferred costs. Antecedents and consequences of the transfer price based on TCE were developed, and an explanatory fair charging model was eventually developed and suggested.

The findings of the study suggest that a chargeback system is an inappropriate scheme for an organization with affiliated autonomous entities. The main contribution of the study is the application of TP methodologies in the public sphere without tax considerations.

Relevance: 30.00%

Abstract:

Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests for global differences in trait values across the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations where only a few trait values are available in a rare genotype category (disbalance), or where the values associated with the three genotype categories exhibit unequal variances (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows mode-specific testing of a dominant, additive, or recessive mode of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise-comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity, in contrast to the standard parametric approaches. We provide a publicly available R library, nparcomp, that can be used to estimate simultaneous confidence intervals or compatible multiplicity-adjusted p-values associated with the proposed maximum test.
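A rough Python sketch of the idea behind a max-type mode-of-inheritance test on rank-based relative effects is given below. The contrast vectors and the permutation p-value are illustrative simplifications: the actual method uses Marcus-type contrasts with a multivariate-t approximation (implemented in the R package nparcomp), not a permutation test.

```python
import numpy as np
from scipy import stats

# One illustrative contrast per inheritance mode (genotypes 0, 1, 2)
CONTRASTS = {"dominant":  np.array([-1.0, 0.5, 0.5]),
             "additive":  np.array([-1.0, 0.0, 1.0]),
             "recessive": np.array([-0.5, -0.5, 1.0])}

def max_stat(trait, genotype):
    r = stats.rankdata(trait)                 # midranks handle ties
    # rank-based relative effect per genotype group, scaled to (0, 1)
    p = np.array([r[genotype == k].mean()
                  for k in (0, 1, 2)]) / len(trait)
    # maximum over the mode-specific contrast statistics
    return max(abs(c @ p) for c in CONTRASTS.values())

def max_moi_test(trait, genotype, n_perm=1000, seed=0):
    """Permutation p-value for the max statistic (a simplification)."""
    rng = np.random.default_rng(seed)
    obs = max_stat(trait, genotype)
    null = [max_stat(trait, rng.permutation(genotype))
            for _ in range(n_perm)]
    pval = (1 + sum(s >= obs for s in null)) / (n_perm + 1)
    return obs, pval

# toy data with a recessive effect: only the rare homozygote differs
rng = np.random.default_rng(1)
g = np.repeat([0, 1, 2], [50, 40, 10])
y = rng.normal(0, 1, 100) + np.where(g == 2, 2.0, 0.0)
stat, p = max_moi_test(y, g)
```

Because the statistic is built from ranks, it is unaffected by monotone transformations of the trait, which is what gives the robustness to non-normality described above.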

Relevance: 30.00%

Abstract:

We estimate the monthly volatility of the US economy from 1968 to 2006 by extending the coincident-index model of Stock and Watson (1991). Our volatility index, which we call VOLINX, has four applications. First, it sheds light on the Great Moderation. VOLINX captures the decrease in volatility in the mid-80s as well as the different episodes of stress over the sample period. In the 70s and early 80s, the stagflation and the two oil crises marked the pace of the volatility, whereas 9/11 is the most relevant shock after the moderation. Second, it helps to understand the economic indicators that cause volatility. While the main determinant of the coincident index is industrial production, VOLINX is mainly affected by employment and income. Third, it adapts the confidence bands of the forecasts. In- and out-of-sample evaluations show that the confidence bands may differ by up to 50% with respect to a model with constant variance. Last, the methodology we use permits us to estimate monthly GDP, which has conditional volatility that is partly explained by VOLINX. These applications can be used by policy makers for monitoring and surveillance of the stress of the economy.

Relevance: 30.00%

Abstract:

The goal of image retrieval and matching is to find and locate object instances in images from a large-scale image database. While visual features are abundant, how to combine them to improve on the performance of individual features remains a challenging task. In this work, we focus on leveraging multiple features for accurate and efficient image retrieval and matching. We first propose two graph-based approaches to rerank initially retrieved images for generic image retrieval. In each graph, vertices are images and edges are similarities between image pairs. Our first approach fuses multiple graphs with a mixture Markov model based on a random walk over them. We introduce a probabilistic model to compute the importance of each feature for graph fusion under a naive Bayesian formulation, which requires similarity statistics from a manually labeled dataset containing irrelevant images. To reduce human labeling, we further propose a fully unsupervised reranking algorithm based on a submodular objective function that can be efficiently optimized by a greedy algorithm. By maximizing an information gain term over the graph, our submodular function favors a subset of database images that are similar to the query images and resemble each other. The function also exploits the rank relationships of images across multiple ranked lists obtained with different features. We then study a more specific application, person re-identification, where the database contains labeled images of human bodies captured by multiple cameras. Re-identification across multiple cameras is treated as a set of related tasks in order to exploit shared information. We apply a novel multi-task learning algorithm using both low-level features and attributes. A low-rank attribute embedding is jointly learned within the multi-task learning formulation to map the original binary attributes to a continuous attribute space, where incorrect and incomplete attributes are rectified and recovered.

To locate objects in images, we design an object detector based on object proposals and deep convolutional neural networks (CNNs), in view of the emergence of deep networks. We improve a Fast R-CNN framework and investigate two new strategies to detect objects accurately and efficiently: scale-dependent pooling (SDP) and cascaded rejection classifiers (CRC). SDP improves detection accuracy by exploiting convolutional features appropriate to the scale of the input object proposals. The CRC effectively utilizes convolutional features and eliminates a large fraction of negative proposals in a cascaded manner, while maintaining a high recall for true objects. Together, the two strategies improve detection accuracy and reduce computational cost.
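As a rough illustration of random-walk-based graph fusion for reranking, the sketch below fuses two toy affinity matrices into a single transition matrix and reranks items by a personalized-PageRank-style walk restarting at the query. The fusion weights `alpha` are simply given here, whereas the approach described above computes feature importances with a probabilistic model; the toy matrices and the restart parameter are assumptions.

```python
import numpy as np

def rerank(W_list, alpha, query_idx, restart=0.15, iters=100):
    """Rerank items by a random walk over a fusion of similarity graphs.

    W_list: one affinity matrix per feature; alpha: fusion weights
    summing to 1, so the fused matrix stays row-stochastic."""
    # row-normalise each graph into a transition matrix, then fuse
    P = sum(a * (W / W.sum(axis=1, keepdims=True))
            for a, W in zip(alpha, W_list))
    n = P.shape[0]
    e = np.zeros(n)
    e[query_idx] = 1.0                     # restart distribution
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):                 # power iteration
        pi = restart * e + (1 - restart) * pi @ P
    return np.argsort(-pi)                 # indices in new rank order

# toy example: two features; items 0-2 form the query's cluster,
# item 3 is weakly connected to everything else
W1 = np.array([[1, .9, .8, .1], [.9, 1, .7, .1],
               [.8, .7, 1, .2], [.1, .1, .2, 1]], float)
W2 = np.array([[1, .8, .9, .2], [.8, 1, .6, .1],
               [.9, .6, 1, .1], [.2, .1, .1, 1]], float)
order = rerank([W1, W2], alpha=[0.6, 0.4], query_idx=0)
print(order)   # query ranked first; the weakly connected item last
```

Items that many high-similarity paths lead back to accumulate stationary probability, so images consistent across both feature graphs rise in the ranking.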