890 results for State space model
Resumo:
In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge which is in a form normally used by human operators. An architecture for process control is proposed which centres on an historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system is expressed by means of three types of rule: database updating rules, process control rules, and data deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state transition model for the application. The dividing instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
Resumo:
The aim of this paper is to use Markov modelling to investigate survival for particular types of kidney patients in relation to their exposure to anti-hypertensive treatment drugs. In order to monitor kidney function, an intuitive three-point assessment is proposed through the collection of blood samples, in relation to Chronic Kidney Disease for Northern Ireland patients. A five-state Markov model was devised using specific transition probabilities for males and females over all age groups. These transition probabilities were then adjusted appropriately using relative risk scores for the event of death for different subgroups of patients. The model was built using the TreeAge software package in order to explore the effects of anti-hypertensive drugs on patients.
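A multi-state survival model of this kind can be sketched as follows (a minimal illustration only: the state labels, transition probabilities, cycle length and relative-risk adjustment are our assumptions, not the paper's fitted TreeAge model):

```python
import numpy as np

# Hypothetical five-state Markov chain: states 0-2 = increasing CKD severity,
# 3 = renal replacement therapy, 4 = death (absorbing). Values are illustrative.
P = np.array([
    [0.90, 0.07, 0.02, 0.00, 0.01],
    [0.05, 0.85, 0.07, 0.01, 0.02],
    [0.00, 0.04, 0.85, 0.07, 0.04],
    [0.00, 0.00, 0.05, 0.85, 0.10],
    [0.00, 0.00, 0.00, 0.00, 1.00],  # death is absorbing
])

def survival_curve(P, start_state=0, cycles=20):
    """Probability of being alive (not in the death state) after each cycle."""
    dist = np.zeros(P.shape[0])
    dist[start_state] = 1.0
    alive = []
    for _ in range(cycles):
        dist = dist @ P          # advance the state distribution one cycle
        alive.append(1.0 - dist[-1])
    return alive

def adjust_death_risk(P, rr):
    """Scale each transient state's death probability by a relative risk score
    for a subgroup, then renormalise the remaining transitions."""
    Q = P.copy()
    Q[:-1, -1] = np.minimum(P[:-1, -1] * rr, 1.0)
    for i in range(Q.shape[0] - 1):
        live = 1.0 - Q[i, -1]
        Q[i, :-1] *= live / Q[i, :-1].sum()
    return Q
```

The relative-risk adjustment mirrors the step described in the abstract: death probabilities are rescaled per subgroup and the rows renormalised so each remains a probability distribution.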
Resumo:
We present a Spatio-temporal 2D Models Framework (STMF) for 2D-Pose tracking. Space and time are discretized and a mixture of probabilistic "local models" is learnt, associating 2D Shapes with 2D Stick Figures. These spatio-temporal models generalize well for a particular viewpoint and state of the tracked action, but some spatio-temporal discontinuities can appear along a sequence as a direct consequence of the discretization. To overcome this problem, we propose to apply a Rao-Blackwellized Particle Filter (RBPF) in the 2D-Pose eigenspace, thus interpolating unseen data between view-based clusters. The fitness of the predicted 2D-Poses to the images is evaluated by combining our STMF with spatio-temporal constraints. A robust, fast and smooth human motion tracker is obtained by tracking only the few most important dimensions of the state space and by refining deterministically with our STMF.
Resumo:
Context. Protoplanetary disks are vital objects in star and planet formation, possessing all the material, gas and dust, which may form a planetary system orbiting the new star. Small, simple molecules have traditionally been detected in protoplanetary disks; however, in the ALMA era, we expect the molecular inventory of protoplanetary disks to significantly increase.
Aims. We investigate the synthesis of complex organic molecules (COMs) in protoplanetary disks to put constraints on the achievable chemical complexity and to predict species and transitions which may be observable with ALMA.
Methods. We have coupled a 2D steady-state physical model of a protoplanetary disk around a typical T Tauri star with a large gas-grain chemical network including COMs. We compare the resulting column densities with those derived from observations and perform ray-tracing calculations to predict line spectra. We compare the synthesised line intensities with current observations and determine those COMs which may be observable in nearby objects. We also compare the predicted grain-surface abundances with those derived from cometary comae observations.
Results. We find COMs are efficiently formed in the disk midplane via grain-surface chemical reactions, reaching peak grain-surface fractional abundances of ~10^-6 to 10^-4 relative to the H nuclei number density. COMs formed on grain surfaces are returned to the gas phase via non-thermal desorption; however, gas-phase species reach lower fractional abundances than their grain-surface equivalents, ~10^-12 to 10^-7. Including the irradiation of grain mantle material helps build further complexity in the ice through the replenishment of grain-surface radicals which take part in further grain-surface reactions. There is reasonable agreement with several line transitions of H2CO observed towards T Tauri star-disk systems. There is poor agreement with HC3N lines observed towards LkCa 15 and GO Tau, and we discuss possible explanations for these discrepancies. The synthesised line intensities for CH3OH are consistent with upper limits determined towards all sources. Our models suggest CH3OH should be readily observable in nearby protoplanetary disks with ALMA; however, detection of more complex species may prove challenging, even with ALMA "Full Science" capabilities. Our grain-surface abundances are consistent with those derived from cometary comae observations, providing additional evidence for the hypothesis that comets (and other planetesimals) formed via the coagulation of icy grains in the Sun's natal disk.
Resumo:
The Organic Rankine Cycle (ORC) is the most commonly used method for recovering energy from small sources of heat. The investigation of the ORC in supercritical condition is a new research area, as it has the potential to generate high power and thermal efficiency in a waste heat recovery system. This paper presents a steady state ORC model in supercritical condition and its simulations with a real engine's exhaust data. The key component of the ORC, the evaporator, is modelled using the finite volume method; the modelling of all other components of the waste heat recovery system, such as the pump, expander and condenser, is also presented. The aim of this paper is to investigate the effects of mass flow rate and evaporator outlet temperature on the efficiency of the waste heat recovery process. Additionally, the necessity of maintaining an optimum evaporator outlet temperature is also investigated. Simulation results show that modification of the mass flow rate is the key to changing the operating temperature at the evaporator outlet.
Resumo:
In recent years we have witnessed a change in the way information is made available online. The emergence of the web for everyone made it easy to edit, publish and share information, generating a considerable increase in its volume. Systems quickly appeared that allow this information to be collected and shared; beyond collecting resources, they also let users describe them using tags or comments. The automatic organisation of this information is one of the major challenges of the current web. Although several clustering algorithms exist, the trade-off between effectiveness (forming groups that make sense) and efficiency (running in acceptable time) is hard to achieve. This research therefore investigates whether an automatic document clustering system becomes more effective when a social classification system is integrated into it. We analyse and discuss two methods, based on the k-means algorithm, for document clustering that allow social tagging to be integrated into the process. The first integrates the tags directly into the Vector Space Model; the second proposes using the tags to select the initial seeds. The first method allows the tags to be weighted according to their occurrence in the document through a parameter called the Social Slider. This method was built on a prediction model which suggests that, under cosine similarity, documents that share tags move closer together, while documents that do not move further apart. The second method gave rise to an algorithm we call k-C, which, besides selecting the initial seeds through a network of tags, also changes how the new centroids are computed at each iteration.
The change to the centroid computation took into account a reflection on the use of Euclidean distance and cosine similarity in the k-means clustering algorithm. For the evaluation of the algorithms, two further algorithms were proposed: the "automatic ground truth" algorithm and the MCI algorithm. The first detects the structure of the data when it is unknown, and the second is an internal evaluation measure based on the cosine similarity between each document and its nearest document. Preliminary results suggest that the first method of integrating tags into the VSM has more impact on the k-means algorithm than on the k-C algorithm. Moreover, the results show no correlation between the choice of the SS parameter and the quality of the clusters. The remaining tests were therefore conducted using only the k-C algorithm (without tag integration in the VSM), and the results indicate that this algorithm tends to generate more effective clusters.
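The tag-weighting idea can be sketched in a toy vector space model under cosine similarity (an illustration only: `ss` stands in for the Social Slider parameter, and all document contents and weights are our assumptions):

```python
import math
from collections import Counter

def doc_vector(terms, tags, ss=1.0):
    """Bag-of-words vector with social tags folded in at weight ss."""
    vec = Counter(terms)
    for t in tags:
        vec[t] += ss  # each tag boosts (or adds) its term weight
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

d1 = doc_vector(["markov", "model", "state"], ["statistics"], ss=2.0)
d2 = doc_vector(["hidden", "markov", "chain"], ["statistics"], ss=2.0)
d3 = doc_vector(["hidden", "markov", "chain"], ["music"], ss=2.0)
```

Here `cosine(d1, d2) > cosine(d1, d3)`: sharing a tag pulls documents together, while differing tags push them apart, which is the behaviour the prediction model above describes.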
Resumo:
Master's dissertation, Water and Coastal Management, Universidade do Algarve, 2007
Resumo:
Master's dissertation in Communication Networks and Multimedia Engineering
Resumo:
This paper presents a novel method for the analysis of nonlinear financial and economic systems. The modeling approach integrates the classical concepts of state space representation and time series regression. The analytical and numerical scheme leads to a parameter space representation that constitutes a valid alternative for representing the dynamical behavior. The results show that business cycles are clearly revealed, while the noise effects common in financial indices are elegantly filtered out.
Resumo:
This paper studies the information content of the chromosomes of 24 species. In a first phase, a scheme inspired by the state space representation of dynamical systems is developed. For each chromosome, the state space dynamical evolution is projected onto a two-dimensional chart. The plots are then analyzed and characterized from the perspective of fractal dimension. This information is integrated into two measures of the species' complexity, addressing its average and variability. The results are in close accordance with phylogenetics, highlighting quantitative aspects of the species' genomic complexity.
Resumo:
Biophysical Chemistry 110 (2004) 83–92
Resumo:
Dissertation submitted to the Instituto Politécnico do Porto for the degree of Master in Logistics. Supervised by Professora Doutora Patrícia Alexandra Gregório Ramos
Resumo:
In this paper we introduce a formation control loop that maximizes the performance of the cooperative perception of a tracked target by a team of mobile robots, while maintaining the team in formation, with a dynamically adjustable geometry which is a function of the quality of the target perception by the team. In the formation control loop, the controller module is a distributed non-linear model predictive controller and the estimator module fuses local estimates of the target state, obtained by a particle filter at each robot. The two modules and their integration are described in detail, including a real-time database associated with a wireless communication protocol that facilitates the exchange of state data while reducing collisions among team members. Simulation and real robot results for indoor and outdoor teams of different robots are presented. The results highlight how our method successfully enables a team of homogeneous robots to minimize the total uncertainty of the tracked target cooperative estimate while complying with performance criteria such as keeping a pre-set distance between the teammates and the target, avoiding collisions with teammates and/or surrounding obstacles.
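The per-robot estimator can be illustrated with a minimal one-dimensional particle filter (a toy sketch under Gaussian motion and measurement noise; all parameters are our assumptions, and the paper's cooperative fusion and distributed controller are omitted):

```python
import math
import random
import statistics

def particle_filter_step(particles, z, motion_std=0.5, meas_std=1.0, rng=random):
    """One predict-update-resample cycle for a scalar target state."""
    # Predict: propagate each particle through a noisy motion model.
    particles = [p + rng.gauss(0.0, motion_std) for p in particles]
    # Update: weight each particle by the Gaussian likelihood of measurement z.
    weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample particles in proportion to their weights.
    return rng.choices(particles, weights=weights, k=len(particles))

rng = random.Random(0)
particles = [rng.gauss(0.0, 1.0) for _ in range(300)]
for _ in range(15):
    particles = particle_filter_step(particles, z=5.0, rng=rng)
estimate = statistics.fmean(particles)  # local point estimate of the target
```

In the paper's setting, each robot would run such a filter on its own observations and the team would then fuse the local estimates.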
Resumo:
In many situations probability models are more realistic than deterministic models. Several phenomena occurring in physics are studied as random phenomena changing with time and space. Stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values from the set T. Then the collection of random variables {X(t), t ∈ T} is called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values X(t) can assume is called the state space.
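A minimal concrete example of such a process (a symmetric random walk, chosen here purely for illustration) makes the definitions tangible:

```python
import random

def random_walk(steps, seed=0):
    """A discrete-time stochastic process {X(t), t = 0..steps}: a symmetric
    random walk on the integers. Its state space is the set of integers."""
    rng = random.Random(seed)
    x = 0
    path = [x]
    for _ in range(steps):
        x += rng.choice([-1, 1])  # X(t+1) = X(t) ± 1 with equal probability
        path.append(x)
    return path

path = random_walk(10)  # one realisation of the process
```

Here T = {0, 1, ..., 10}, each X(t) is a random variable, and every realised state lies in the state space (the integers).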
Resumo:
Institutionalist Theories and Hegemonic Practices of Global Policy-Making. A New Examination of the Premises of Liberal Democratic Nation-State Orders. German summary: Modern social sciences, whether meta-theories of international relations, the history of political economy, or theories of institutions, exhibit a clear tripartite division of worldviews or paradigms that can be traced through all the "great debates": realism, liberalism, and historical materialism. These fundamentally different paradigms can also be identified in current institutionalist approaches, but they cut across the categorisations of institutionalist schools proposed by other scholars (Meyer, Rittberger, Hasenclever, Peters, Zangl), which as a rule ignore system-critical perspectives or discuss them only rudimentarily. This thesis therefore develops a comparison of institutionalist schools along the worldviews sketched above. The aim is to clarify fundamental differences between the three paradigms and to show how their respective ontological and epistemological premises shape the research designs and methodologies of the institutionalist schools. In Part I, I therefore work out the basic premises of each paradigm, and in Parts II and III I develop institutionalist schools corresponding to these premises, which understand cooperation primarily as the organisation of insurmountable rivalry, as the result of increasing convergence, or as the result and further development of procedures of interaction. Here I draw on contemporary work by other authors, thereby providing a comparison of the current state of research in all three traditions of thought.
Part II discusses the two dominant institutionalist schools, and Part III develops an original Gramscian approach to explaining international cooperation and institutionalisation. The overarching thesis of this work is that the methodologies of the dominant institutionalist schools have teleological effects, which result from their claim to universally applicable, abstracted concepts and which limit the interpretation of observations. Premises of a rationally acting individual, whether calculating consequences or reflecting on appropriateness, entail that cooperation and institutionalisation must necessarily count as the best solution for all parties involved in a given situation: institutions would not exist if they did not, on balance, benefit all their members (whether egoistically or cooperatively motivated). Through this interpretative "lens", important structural reasons for the adoption of international agreements, and parts of their effects, go unconsidered. Consequently, deviations from expected outcomes cannot be adequately explained either. My corresponding hypothesis is that system-critical approaches can explain cooperation more consistently, since they take actors, structures, and the worldviews surrounding them as analytical criteria in their own right. Institutionalisation is then understood as a gradual process of political decision-making, implementation, and entrenchment, shaped by the prevailing institutions and interpretations of "reality". Every political organisation is understood as a temporally and geographically demarcated state space (Staatsraum), whose mandate is to lay down procedures of interaction for societal development. Political actors act in reference to these official procedures and thereby continuously reproduce and/or change them.
Institutions are thus understood as an integral part of societal development processes, and the power of worldviews, including theoretical concepts, is taken into account. The latter guide the perception and interpretation of codified rules and thereby influence their perceived legitimacy and acceptance. This effect was discussed as the "spirit of the state" ("Staatsgeist") by Montesquieu and Hegel and taken up by Antonio Gramsci in his theory of hegemony. Taking it into account allows a consistent explanation of apparently irrational or inappropriate individual decision-making, as well as of the negative effects of consensual agreements. To illustrate the newly developed concepts, Part II analyses existing case studies of the World Trade Organization and works out how worldviews or paradigms lead to different explanations of practice. While Part II pays particular attention to observations that remain unexplained, and are inexplicable within the dominant paradigms, Part III applies the Gramscian concepts precisely to these blind spots and delivers new insights. The outlook problematises the fact that apparently "neutral" scientific studies legitimise political positions and demands, and makes clear, in the spirit of Gramscian theory, that science is itself part of political struggles.