906 results for Adaptive object model
Abstract:
We propose a probabilistic object classifier for outdoor scene analysis as a first step in solving the problem of scene context generation. The method begins with top-down control, which uses previously learned models (appearance and absolute location) to obtain an initial pixel-level classification. This information provides the cores of objects, which are used to acquire a more accurate object model; growing these cores by specific active regions then yields an accurate recognition of known regions. Next, a general segmentation stage segments the unknown regions using a bottom-up strategy. Finally, the last stage performs a region fusion of known and unknown segmented objects. The result is both a segmentation of the image and a recognition of each segment as a given object class or as an unknown segmented object. Furthermore, experimental results are shown and evaluated to demonstrate the validity of our proposal.
Abstract:
When object databases arrived on the scene some ten years ago, they provided database capabilities for previously neglected, complex applications, such as CAD, but were burdened with one inherent teething problem: poor performance. Physical database design is one tool that can provide performance improvements and it is the general area of concern for this thesis. Clustering is one fruitful design technique which can provide improvements in performance. However, clustering in object databases has not been explored in depth and so has not been truly exploited. Further, clustering, although a physical concern, can be determined from the logical model. The object model is richer than previous models, notably the relational model, and so it is anticipated that the opportunities with respect to clustering are greater. This thesis provides a thorough analysis of object clustering strategies with a view to highlighting any links between the object logical and physical model and improving performance. This is achieved by considering all possible types of object logical model construct and the implementation of those constructs in terms of theoretical clustering strategies to produce actual clustering arrangements. This analysis results in a greater understanding of object clustering strategies, aiding designers in the development process and providing some valuable rules of thumb to support the design process.
Abstract:
This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches of combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models on the electricity demand/gas price forecast are the adaptive MLP/GARCH with the multicomponent forecast; their MSEs are 0.02314 and 0.15384 respectively.
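The adaptive-model idea in this abstract, updating a model's parameters continuously on the test set with a Kalman filter, can be sketched as a recursive update of linear-regression weights. This is an illustrative sketch under the common random-walk state-space formulation, not the paper's implementation; the function name and the noise settings `q` and `r` are assumptions.

```python
import numpy as np

def kalman_step(w, P, x, y, q=1e-4, r=1.0):
    """One Kalman-filter update that treats the regression weights w as the
    latent state: random-walk state model, linear observation y = x @ w + noise.
    The noise settings q and r are illustrative assumptions, not the paper's."""
    P = P + q * np.eye(len(w))   # predict: random walk inflates the covariance
    s = x @ P @ x + r            # innovation variance
    k = P @ x / s                # Kalman gain
    w = w + k * (y - x @ w)      # correct the weights with the new sample
    P = P - np.outer(k, x @ P)   # shrink covariance along the observed direction
    return w, P
```

Running one step per test-set sample keeps a linear model adaptive; for nonlinear models such as an MLP or GARCH, the observation equation becomes nonlinear and the extended Kalman filter linearises it around the current state.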
Abstract:
The proliferation of data throughout the strategic, tactical and operational areas of many organisations has created a need for the decision maker to be presented with structured information that is appropriate for achieving allocated tasks. However, despite this abundance of data, managers at all levels in the organisation commonly encounter a condition of ‘information overload’ that results in a paucity of the correct information. Specifically, this thesis focuses upon the tactical domain within the organisation and the information needs of the management who reside at this level. In doing so, it argues that the link between decision making at the tactical level and low-level transaction processing data should be a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. To achieve this, the Co-ordinated Business Object Model (CBOM) was created. The CBOM details a two-tier framework: the first tier models data based upon four interactive object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through its graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. To validate the CBOM, three verification tests were completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information that is used to support tactical-level decision making.
Abstract:
Object-oriented design and object-oriented languages support the development of independent software components such as class libraries. When using such components, versioning becomes a key issue. While various ad-hoc techniques and coding idioms have been used to provide versioning, all of these techniques have deficiencies - ambiguity, the necessity of recompilation or re-coding, or the loss of binary compatibility of programs. Components from different software vendors are versioned at different times. Maintaining compatibility between versions must be consciously engineered. New technologies such as distributed objects further complicate libraries by requiring multiple implementations of a type simultaneously in a program. This paper describes a new C++ object model called the Shared Object Model for C++ users and a new implementation model called the Object Binary Interface for C++ implementors. These techniques provide a mechanism for allowing multiple implementations of an object in a program. Early analysis of this approach has shown it to have performance broadly comparable to conventional implementations.
Abstract:
Even though Software Transactional Memory (STM) is one of the most promising approaches to simplifying concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: if only a small fraction of the memory is under contention, lightweight barriers may be used for the rest. In this work, we propose a new solution based on an approach of adaptive object metadata (AOM) to promote the use of a fast path to access objects that are not under contention. We show that this approach makes the performance of an STM competitive with the best fine-grained lock-based approaches in some of the most challenging benchmarks.
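The fast-path idea behind adaptive object metadata can be illustrated with a toy object that starts on a lightweight, non-blocking path and is "inflated" to a full blocking barrier once contention is observed. This is only a sketch of the general technique; the two states, the threshold of three failed attempts, and the use of a plain Python lock are illustrative assumptions, not the paper's AOM implementation.

```python
import threading

class AdaptiveObject:
    """Toy sketch of adaptive object metadata: objects start in a FAST state
    using an optimistic, non-blocking path; repeated contention inflates them
    to a CONTENDED state that always takes the full blocking barrier."""
    FAST, CONTENDED = 0, 1

    def __init__(self, value=0):
        self.value = value
        self.state = AdaptiveObject.FAST
        self.failed_tries = 0
        self.lock = threading.Lock()

    def update(self, fn):
        if self.state == AdaptiveObject.FAST:
            # lightweight path: optimistic, never blocks
            if self.lock.acquire(blocking=False):
                try:
                    self.value = fn(self.value)
                    return
                finally:
                    self.lock.release()
            # contention observed: count it and maybe inflate the object
            self.failed_tries += 1
            if self.failed_tries >= 3:       # illustrative threshold
                self.state = AdaptiveObject.CONTENDED
        # slow path: full blocking barrier
        with self.lock:
            self.value = fn(self.value)
```

Under no contention every update takes the non-blocking path; only objects that repeatedly fail the optimistic attempt pay for the heavyweight barrier, which is the asymmetry the abstract exploits.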
Abstract:
Dissertation submitted for the degree of Doctor in Electrical and Computer Engineering
Abstract:
Dynamic simulation of petroleum reservoirs requires assigning equivalent permeabilities to all grid blocks. Determining the equivalent permeability of fractured reservoirs is a complex and critical step in the modelling workflow, because it depends entirely on the 3D geometry of the fracture system and the respective apertures, which are often poorly known. To evaluate the equivalent permeability of fractured blocks, the tensor (or Oda) method is one of the most widely used; it is fast and efficient even for systems with several million fractures. However, the literature raises some criticisms, for example, that it overestimates permeability in sparsely fractured blocks and underestimates it in heavily fractured ones. This work revisits the problem of characterising equivalent permeability in fractured reservoir blocks. It relies on a software package, FROM3D-K (fractures object model – permeability evaluation), developed and/or adapted in the context of this work, which provides the following functionality: (1) 3D stochastic simulation of fractures; (2) determination of equivalent permeability by the tensor method; (3) determination of equivalent permeability by microblock upscaling. These functions allow the same fracture network to be evaluated by both methods. Two examples demonstrate the results: in the first, fractures are simulated conditioned to synthetic statistics of orientation, intensity and aperture; the second uses FMI well-log data from a fractured reservoir.
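The tensor (Oda) method mentioned in the abstract can be sketched as follows: each fracture contributes a cubic-law transmissivity, aperture cubed over twelve, weighted by its area and projected through its unit normal into a crack tensor, from which the block permeability tensor is assembled. This is a hedged sketch assuming the commonly stated form of Oda (1985); the function name, input layout, and units are illustrative.

```python
import numpy as np

def oda_permeability(fractures, volume):
    """Sketch of the tensor (Oda) method for one grid block. Each fracture is
    (area, aperture, unit_normal); its cubic-law term aperture**3 / 12
    accumulates into a crack tensor F, and flow is restricted to the fracture
    planes via trace(F) * I - F. Illustrative, following the commonly stated
    form of Oda (1985)."""
    F = np.zeros((3, 3))
    for area, aperture, normal in fractures:
        n = np.asarray(normal, dtype=float)
        F += (area * aperture ** 3 / 12.0) * np.outer(n, n)
    F /= volume
    return np.trace(F) * np.eye(3) - F  # no flow across a fracture's own plane
```

For a single horizontal fracture the resulting tensor has zero vertical permeability and equal horizontal components, matching the intuition that flow is confined to the fracture plane.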
Abstract:
Solar passive strategies that have been developed in vernacular architecture from different regions are a response to specific climate effects. These strategies are usually simple and low-tech and have low potential environmental impact. For this reason, several studies highlight their potential to reduce the non-renewable energy demands of building operation. This paper presents the climatic contrast between the northern and southern parts of mainland Portugal, namely the regions of Beira Alta and Alentejo, and discusses the contribution of the different climate-responsive strategies developed in the vernacular architecture of both regions to assuring thermal comfort conditions. In Beira Alta, glazed balconies are commonly used to capture solar gains, while in Alentejo the focus is on passive cooling strategies. To assess the effectiveness of these strategies, the thermal performance and comfort conditions of two case studies were evaluated based on the adaptive comfort model. Field tests included measurement of hygrothermal parameters and surveys of the occupants’ thermal sensation. The results show that the case studies achieved good thermal performance by passive means alone and that the occupants feel comfortable, except during winter, when simple heating systems are needed.
Abstract:
OBJECTIVE: Cognitive change over the course of psychodynamic psychotherapy has been postulated by several models, but has rarely been studied. Based on the adaptive skills model (Badgio, Halperin, & Barber, 1999), it is reasonable to expect that very brief dynamic psychotherapy may be associated with change in coping patterns and cognitive errors (also known as cognitive distortions). METHOD: N = 50 outpatients presenting with various psychiatric disorders and undergoing 4 sessions of Brief Psychodynamic Intervention (BPI; Despland, Drapeau, & de Roten, 2005; Despland, Michel, & de Roten, 2010) were included in this naturalistic study (mean age: 31 years; 56% female; all Caucasian). Cognitive errors and coping strategies were assessed using the Cognitive Errors Rating Scale (Drapeau et al., 2008) and the Coping Patterns Rating Scale (Perry et al., 2005). These observer-rated methods were applied to the verbatim transcripts of all 4 therapy sessions completed by each patient. RESULTS: Results indicate change in both cognitive errors and coping patterns over the course of BPI, including an increase in Overall Coping Functioning and a decrease in unhelpful coping processes such as isolation, reflecting a shift toward appraising stress as a challenge by the end of treatment. These changes predicted symptom change at the end of treatment. While cognitive errors also changed over the course of BPI, no predictive effect on symptom change was found. CONCLUSIONS: These results are interpreted within the framework of common change principles in psychotherapy. Implications and future research are discussed.
Abstract:
The goal of this Master's thesis was to improve the long response times and unstable operation of a cottage-booking system built on an object database, and to create a database foundation for new features. The solution investigated was replacing the object database with a relational database. The thesis is part of a complete renewal of the booking system. The theoretical part covers the structure of both object and relational databases and the transformation of an object model into a relational model. The practical part presents the step-by-step creation of the relational database and describes the principles of the data migration. The relational database proved to be more stable and to have better response times. It also required less disk space and memory than the object database. In addition, integrating new systems with it was found to be simpler.
Abstract:
The main purpose of this thesis is to introduce a new lossless compression algorithm for multispectral images. The proposed algorithm reduces the band-ordering problem to finding a minimum spanning tree in a weighted directed graph, where the graph vertices correspond to the multispectral image bands and the arc weights are computed using a newly devised adaptive linear prediction model. The adaptive prediction model is an extended unification of 2- and 4-neighbour pixel-context linear prediction schemes. The algorithm predicts each image band individually, using the optimal prediction scheme defined by the adaptive prediction model and the optimal predicting band suggested by the minimum spanning tree. Its efficiency has been compared against the best lossless compression algorithms for multispectral images, considering three recently introduced algorithms. The numerical results support the conclusion that the adaptive-prediction-based algorithm is the best for lossless compression of multispectral images. Real multispectral data captured from an airplane were used for the testing.
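The band-ordering step described in the abstract can be illustrated with a small Prim-style spanning-tree sketch over a band-to-band prediction-cost matrix. Note that the thesis works with a weighted directed graph (a minimum spanning arborescence); this symmetric sketch, with illustrative names, is a simplification of that idea.

```python
def prediction_tree(cost, root=0):
    """Prim-style sketch of the band-ordering step: build a spanning tree over
    the image bands, where cost[i][j] approximates the residual of predicting
    band j from band i. Returns parent[j] = the band that band j is predicted
    from (None for the root band, which is coded without a reference).
    Illustrative simplification of the directed-MST formulation."""
    n = len(cost)
    in_tree = {root}
    parent = [None] * n
    while len(in_tree) < n:
        # pick the cheapest edge from a band already in the tree to one outside
        i, j = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: cost[e[0]][e[1]])
        parent[j] = i
        in_tree.add(j)
    return parent
```

Each non-root band is then encoded as the residual against its parent band, under the prediction scheme chosen by the adaptive model.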
Abstract:
This work was carried out as part of the MASTO research project, whose aim is to develop an adaptive reference model for software testing. The work was conducted as a statistical study using the survey method. The study interviewed 31 organisational units across Finland that develop medium-criticality applications. The hypotheses were that quality depends on the software development method, customer participation, standards compliance, the customer relationship, business orientation, criticality, trust, and the level of testing. Correlation and regression analyses were performed to look for correlations between these factors and quality. The study also surveyed which software development practices, methods and tools the organisational units used; problems and improvement suggestions related to software testing; the most significant ways in which the customer can influence software quality; and the biggest benefits and drawbacks of outsourcing software development or testing. The study found that quality correlates positively and statistically significantly with the level of testing, standards compliance, customer participation in the design phase, customer participation in steering, trust, and one sub-question concerning the customer relationship. Based on the regression analysis, a regression equation was formed in which quality was found to depend positively on standards compliance, customer participation in the design phase, and trust.
Abstract:
The goal of this work was to describe and prioritise the requirements of a dynamic supply-chain modelling tool and, on that basis, to construct an object model to support software development. The requirements were elicited through theoretical analysis, previously conducted surveys, and five pilot cases. Supply-chain management is not only about managing material flows but also about managing the related information. Modelling holistic supply-chain problems therefore requires modelling the information flows and the control mechanisms that accompany them. There is clearly room in the market for support systems that enable the examination of multidimensional (profit, time, service) supply-chain problems. In line with system dynamics theory, the modelling of the most important feedback loops was chosen as the starting point of the object model. Feedback loops make it possible to model complex systems over time. The modelled supply-chain feedback loops are the operations, control, demand and strategy loops. Based on the supply-chain control mechanisms and the fundamentals of system dynamics, an object model was constructed from the requirements of the modelling tool. The resulting object model is the foundation of Locomotive, a supply-chain modelling tool.
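The feedback-loop modelling that this abstract builds on can be illustrated with a minimal system-dynamics sketch of a single balancing loop: a stock (inventory) is corrected toward a target at a rate proportional to the gap. All names and parameter values here are illustrative assumptions and are not taken from the Locomotive tool.

```python
def simulate_inventory(steps, demand, target=100.0, adjust_time=4.0, dt=1.0):
    """Minimal system-dynamics sketch of one balancing feedback loop of the
    kind modelled for supply chains: the order rate carries a correction term
    proportional to the inventory gap, so the stock converges to its target.
    Parameter names and values are illustrative only."""
    inventory, history = 0.0, []
    for _ in range(steps):
        order_rate = demand + (target - inventory) / adjust_time  # feedback term
        inventory += (order_rate - demand) * dt                   # stock integrates net flow
        history.append(inventory)
    return history
```

The loop is balancing: the further inventory is below target, the larger the correction, so the trajectory approaches the target geometrically. Reinforcing loops (e.g. demand amplification) would add terms that grow with the gap instead of closing it.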