872 results for Discontinuous dynamic systems
Abstract:
Cultural landscapes, as the expression of centuries of intensive interaction between humans and the natural environment surrounding them, are a traditional research object of geography. Human/nature interactions change the natural environment as humans cultivate and modify landscapes. In viticulture, these interactions involve strong feedback: changes in the natural environment act back on the winegrowers who live and work in the cultural landscape and influence their further actions, which in turn affects the development of the entire viticultural landscape. Cultural landscape is therefore conceptualized as a heterogeneous web of interacting social and natural elements, whose development both kinds of elements shape simultaneously and reciprocally. Fundamental to this thesis is the conviction that cultural landscapes continually reorganize themselves through human/nature interactions and never settle into an equilibrium state, but instead constantly develop and change. Complexity theory offers the appropriate theoretical foundation for this view. It focuses on the development and change of systems and searches for the mechanisms of systemic interdependence in order to understand the overall behavior of non-linear dynamic systems. On the basis of complexity theory, an analytical framework is developed that makes it possible to capture the socio-economic and spatial-structural processes of change in cultural landscape development as a mutually influencing systemic context. The reconstruction of development phases, the analysis of spatial-structural patterns and actor constellations, and the identification of bifurcation points in the system's history are of central importance here.
By examining both the physical-spatial and the socio-economic dimension of cultural landscape development in the viticulture of the Upper Middle Rhine Valley, this thesis aims to contribute to the geographical study of human/nature interactions at the interface of physical and human geography. The framework is applied to viticulture in the Upper Middle Rhine Valley. For many decades the growing region has been subject to a sharp decline in the number of wine-growing businesses and in vineyard area. These declines since 1950 did not proceed linearly but differentiated the system into distinct development paths. Farm structures and the general conditions of viticulture changed fundamentally, leaving visible traces in the cultural landscape. Reconstructing and analyzing this, and identifying the external and internal factors that mattered in the different phases of development, is intended to generate a deeper understanding of the system's self-organized behavior and, on that basis, to point out options for future interventions in the system's development.
Abstract:
Wind energy has been one of the fastest-growing sectors of the nation’s renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years given aggressive governmental policies for reducing fossil fuel dependency. The so-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies for massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information on the actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model.
For this aim, System Identification (SI) techniques offer a spectrum of well-established numerical methods, appropriate for stationary, deterministic, data-driven schemes, capable of predicting the actual dynamic states (eigenrealizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric is proposed, based on the so-called Subspace Realization Theory and adapted to stochastic, non-stationary, time-varying systems such as a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components the wind turbine is made of. In the long run, both the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, known as the Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the Thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped-gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
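The MAC used by the model-updating engine has a compact standard definition: the squared normalized inner product of two mode-shape vectors. A minimal sketch on generic vectors (illustrative only, not the thesis's HAWT model):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors.

    MAC = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)).
    1.0 means identical shapes up to scaling; near 0 means orthogonal shapes.
    """
    phi_a = np.asarray(phi_a, dtype=complex).ravel()
    phi_b = np.asarray(phi_b, dtype=complex).ravel()
    num = abs(np.vdot(phi_a, phi_b)) ** 2          # vdot conjugates phi_a
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

# A mode shape compared with a scaled copy of itself and a distinct shape.
phi_1 = np.array([1.0, 0.8, 0.3, -0.2])
phi_2 = 2.5 * phi_1                       # same shape, different scaling
phi_3 = np.array([0.1, -0.4, 0.9, 0.6])  # a different shape

mac_same = mac(phi_1, phi_2)   # close to 1.0
mac_diff = mac(phi_1, phi_3)   # well below 1.0
```

In a model-updating loop such as the ASA engine described above, 1 - MAC between measured and predicted mode shapes is a natural cost term to minimize.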
Abstract:
It is system dynamics that determines the function of cells, tissues and organisms. Developing mathematical models and estimating their parameters is an essential step in studying the dynamic behavior of biological systems, including metabolic networks, genetic regulatory networks and signal transduction pathways, under perturbation by external stimuli. In general, biological dynamic systems are only partially observed. A natural way to model them is therefore to employ nonlinear state-space equations. Although statistical methods for parameter estimation in linear models of biological dynamic systems have been developed intensively in recent years, the joint estimation of states and parameters of nonlinear dynamic systems remains a challenging task. In this report, we apply the extended Kalman filter (EKF) to the estimation of both states and parameters of nonlinear state-space models. To evaluate its performance for parameter estimation, we apply the EKF to a simulation dataset and two real datasets: the JAK-STAT and Ras/Raf/MEK/ERK signal transduction pathway datasets. The preliminary results show that the EKF can accurately estimate the parameters and predict the states in nonlinear state-space models of dynamic biochemical networks.
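The joint state-and-parameter EKF idea can be illustrated on a toy system: a single decaying species whose decay rate is unknown and is estimated by augmenting the state vector. Everything below (the one-species model, noise levels, tuning) is an illustrative assumption, not the JAK-STAT or Ras/Raf/MEK/ERK model from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: x' = -theta * x, observed with noise; theta is unknown.
# Augmented state z = [x, theta]; theta evolves as a constant.
dt, n_steps, theta_true = 0.05, 200, 1.5
x_true = 5.0
y = np.empty(n_steps)
for k in range(n_steps):
    x_true += dt * (-theta_true * x_true)        # Euler step of the truth
    y[k] = x_true + rng.normal(0.0, 0.05)        # noisy observation of x

z = np.array([4.0, 0.5])          # initial guesses for x and theta
P = np.diag([1.0, 1.0])           # state covariance
Q = np.diag([1e-6, 1e-6])         # small process noise keeps P positive
R = np.array([[0.05 ** 2]])       # measurement noise variance
H = np.array([[1.0, 0.0]])        # only x is observed

for k in range(n_steps):
    x, th = z
    # Predict: Euler step of the augmented dynamics and its Jacobian.
    z = np.array([x + dt * (-th * x), th])
    F = np.array([[1.0 - dt * th, -dt * x],
                  [0.0, 1.0]])
    P = F @ P @ F.T + Q
    # Update with the measurement y[k].
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z = z + (K @ (np.array([y[k]]) - H @ z)).ravel()
    P = (np.eye(2) - K @ H) @ P

theta_hat = float(z[1])   # should approach theta_true = 1.5
```

The same augmentation pattern scales to the pathway models in the report: parameters are appended to the state vector and the Jacobian is taken over the augmented dynamics.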
Abstract:
Dynamic systems, especially in real-life applications, are often characterized by inter-/intra-variability, uncertainties and time-varying components. Physiological systems are probably the most representative example, in which population variability, vital-signal measurement noise and uncertain dynamics render their explicit representation and optimization a rather difficult task. Systems characterized by such challenges often require adaptive algorithmic solutions able to perform an iterative structural and/or parametric update process towards optimized behavior. Adaptive optimization presents the advantages of (i) individualization through learning of basic system characteristics, (ii) the ability to follow time-varying dynamics and (iii) low computational cost. In this chapter, the use of online adaptive algorithms is investigated in two basic research areas related to diabetes management: (i) real-time glucose regulation and (ii) real-time prediction of hypo-/hyperglycemia. The applicability of these methods is illustrated through the design and development of an adaptive glucose control algorithm based on reinforcement learning and optimal control, and an adaptive, personalized early-warning system for the recognition of, and alarm generation against, hypo- and hyperglycemic events.
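The online-adaptation idea can be sketched with a much simpler algorithm than the chapter's actual reinforcement-learning and optimal-control designs: a recursive-least-squares (RLS) autoregressive predictor that updates its coefficients with every new glucose sample, so it individualizes to the signal and tracks slow drift. The signal and all parameters below are synthetic and illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic glucose-like signal: slow sinusoidal drift around 120 mg/dL.
t = np.arange(400)
g = 120 + 30 * np.sin(2 * np.pi * t / 150) + rng.normal(0, 1.0, t.size)

# RLS-adapted AR(3) one-step-ahead predictor.
order, lam = 3, 0.995          # model order and forgetting factor
w = np.zeros(order)            # AR coefficients, updated online
P = np.eye(order) * 1000.0     # inverse correlation matrix (large = fast start)
errs = []
for k in range(order, t.size):
    phi = g[k - order:k][::-1]         # most recent samples first
    y_hat = w @ phi                    # prediction before seeing g[k]
    e = g[k] - y_hat
    errs.append(e)
    # Standard RLS gain and updates.
    Pphi = P @ phi
    gain = Pphi / (lam + phi @ Pphi)
    w = w + gain * e
    P = (P - np.outer(gain, Pphi)) / lam

# After adaptation the one-step prediction error approaches the noise floor.
late_rmse = float(np.sqrt(np.mean(np.square(errs[-100:]))))
```

The forgetting factor `lam` is what gives the tracking property item (ii) refers to: values below 1 discount old samples exponentially, letting the coefficients follow time-varying dynamics.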
Abstract:
Aim: The landscape metaphor allows viewing corrective experiences (CEs) as a pathway to a state with relatively lower 'tension' (a local minimum). According to the metaphor, however, such local minima are not easily accessible but are obstructed by states with relatively high tension (local maxima) (Caspar & Berger, 2012). For example, an individual with spider phobia has to transiently tolerate high levels of tension during exposure therapy to access the local minimum of habituation. To allow for more specific therapeutic guidelines and empirically testable hypotheses, we advance the landscape metaphor to a scientific model based on motivational processes. Specifically, we conceptualize CEs as available but unusual trajectories (pathways) through a motivational space whose dimensions are set up by basic motives such as the need for agency or attachment. Methods: Dynamic systems theory is used to model motivational states and trajectories with mathematical equations. Conveniently, these equations have easy-to-comprehend, intuitive visual representations similar to the landscape metaphor. Trajectories that represent CEs are thus informative and action-guiding for both therapists and patients without knowledge of dynamic systems, while the mathematical underpinnings of the model allow researchers to deduce hypotheses for empirical testing. Results: First, results of simulations of CEs during exposure therapy in anxiety disorders are presented and compared with empirical findings. Second, hypothetical CEs in an autonomy-attachment conflict are reported from a simulation study. Discussion: Preliminary clinical implications for the evocation of CEs are drawn after a critical discussion of the proposed model.
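The landscape intuition can be made concrete with a toy simulation: overdamped gradient dynamics on a double-well "tension" potential, where a sufficiently strong external drive (standing in for the therapeutic intervention) forces the system over the barrier, so tension transiently rises before the new minimum is reached. The potential and parameter values are illustrative assumptions, not the authors' model:

```python
import numpy as np

# Toy tension landscape: V(x) = (x^2 - 1)^2 has two local minima
# (x = -1: avoidance, x = +1: habituation) separated by a barrier at x = 0.
def V(x):
    return (x ** 2 - 1) ** 2

def dV(x):
    return 4 * x * (x ** 2 - 1)

dt, n_steps = 0.01, 3000
x = -1.0                        # start in the left minimum (avoidance)
drive = 2.0                     # external forcing strong enough to tilt the well
traj = np.empty(n_steps)
for k in range(n_steps):
    x += dt * (-dV(x) + drive)  # overdamped gradient dynamics + forcing
    traj[k] = x

peak_tension = float(V(traj).max())   # tension peaks while crossing the barrier
final_state = float(traj[-1])         # settles near the right-hand minimum
```

With a weaker drive the trajectory stays trapped near x = -1, mirroring the clinical point that a CE requires transiently tolerating high tension: the barrier crossing (peak_tension near 1) is unavoidable on the way to the lower-tension state.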
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled once the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, can solve the atomic structure of the system but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the different constituent components of an assembly but encounter difficulties when investigating the entire system. On the other hand, imaging techniques such as cryo-electron microscopy (cryo-EM) are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate structural data from different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III for this alternative. Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet its reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable a multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space.
As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling their proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method measures sensitivities between 70 and 100% in experimental test cases, i.e.
70-100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed within the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
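Registration approaches like those above ultimately need a fitting score measuring how well a placed component explains the experimental density. A minimal sketch of one common choice, normalized cross-correlation between two density grids, on toy 3D maps (illustrative only; this is not the Sculptor implementation):

```python
import numpy as np

rng = np.random.default_rng(2)

def ncc(a, b):
    """Normalized cross-correlation between two density maps of equal shape."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Toy 3D "assembly map": a box-shaped blob at a known location, plus noise.
grid = np.zeros((24, 24, 24))
grid[10:16, 8:14, 12:18] = 1.0
noisy_map = grid + rng.normal(0, 0.05, grid.shape)

# Two candidate placements of the same component density.
good = np.zeros_like(grid)
good[10:16, 8:14, 12:18] = 1.0    # the true placement
bad = np.zeros_like(grid)
bad[2:8, 2:8, 2:8] = 1.0          # a misplaced candidate

score_good = ncc(noisy_map, good)   # high: the component fits the density
score_bad = ncc(noisy_map, bad)     # near zero: the component is misplaced
```

A search strategy (exhaustive, genetic, or tabu) then explores placements and keeps those with the highest score; in the multi-body case all components are scored jointly so they also constrain one another spatially.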
Abstract:
Modern sensor technologies and simulators applied to large and complex dynamic systems (such as road traffic networks, sets of river channels, etc.) produce large amounts of behavior data that are difficult for users to interpret and analyze. Software tools that generate presentations combining text and graphics can help users understand this data. In this paper we describe the results of our research on automatic multimedia presentation generation (including text, graphics, maps, images, etc.) for interactive exploration of behavior datasets. We designed a novel user interface that combines automatically generated text and graphical resources. We describe the general knowledge-based design of our presentation generation tool. We also present applications that we developed to validate the method, and a comparison with related work.
Abstract:
Low economic growth in sub-Saharan Africa leads to the need to propose an economic model adapted to the region's particular characteristics, one that ultimately raises the quality of life of the societies living in these countries through improvements in all social fields, such as education, health and nutrition, that can help transform the prospects for economic growth, especially in the countries under study, which are characterized by low income and low human development. It can therefore be concluded that human development is the end and economic growth a means: the purpose of economic growth should be to enrich people's lives. Short-term advances in human development are possible thanks to higher economic growth, which in turn must not be divorced from respect for the environment and the surroundings. To achieve these objectives, this thesis proposes an economic model built following the System Dynamics methodology, using the VENSIM software package. The proposed model is based on the production of electricity that would be able to supply a population and generate a surplus; the surplus could be sold and the proceeds reinvested to drive the economic growth of the population it serves.
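The reinvestment loop at the heart of such a stock-and-flow model can be caricatured in a few lines of Euler-integrated Python. All parameter values below (capacity, demand, price, cost, reinvestment share) are invented for illustration and do not come from the thesis's VENSIM model:

```python
# Minimal stock-and-flow caricature of a surplus-reinvestment loop:
# installed capacity (stock) generates a surplus over local demand,
# surplus revenue is partly reinvested into new capacity.
dt, years = 0.1, 30
capacity = 10.0          # MW of installed generation (stock)
demand = 8.0             # MW consumed locally (stock)
price = 100.0            # revenue per MW-year of surplus sold
unit_cost = 400.0        # investment needed per additional MW
reinvest_share = 0.6     # fraction of surplus revenue reinvested

history = []
for _ in range(int(years / dt)):
    surplus = max(capacity - demand, 0.0)                    # only excess is sold
    revenue = surplus * price
    capacity += dt * reinvest_share * revenue / unit_cost    # investment inflow
    demand += dt * 0.02 * demand                             # demand grows 2%/yr
    history.append(capacity)

final_capacity = history[-1]
```

The positive feedback (more capacity, more surplus, more investment) makes capacity grow faster than demand as long as the reinvestment rate exceeds the demand-growth drag, which is the qualitative behavior a system dynamics tool like VENSIM would let one explore interactively.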
Abstract:
New trends in sharing multimedia files over open networks demand better encryption techniques that guarantee integrity, availability and confidentiality while maintaining or improving the efficiency of the encryption process for these files. Transferring images through technological media is now commonplace, so existing encryption techniques need to be updated and, better still, new alternatives sought. Classic cryptographic algorithms are widely known within the computing community, which increases their vulnerability, and they also incur long processing times, raising the probability of their being broken and reducing the immediate availability of resources. To lower these odds, chaos theory emerges as a good option: an algorithm that exploits the chaotic behavior of dynamic systems and takes advantage of the properties of logistic maps can raise the robustness of the cipher. This work therefore proposes a cryptographic system built on an architecture divided into two stages, confusion and diffusion. Each stage uses a logistic equation to generate pseudo-random numbers that scramble pixel positions and change grey-scale intensities. This iterative process is driven by the total number of pixels in the image. Finally, the whole encryption logic runs on CUDA technology, which enables parallel processing. As a substantial contribution, a new encryption technique is proposed that is highly sensitive to external noise while preserving not only the confidentiality of the image but also its availability and efficient processing times.
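The two-stage confusion/diffusion idea can be sketched on the CPU (without the CUDA parallelization) using the logistic map x_{k+1} = r·x_k·(1 − x_k) as the pseudo-random source. The key values and the exact keystream construction below are illustrative assumptions, not the paper's scheme:

```python
import numpy as np

def logistic_seq(x0, r, n):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k) for n steps."""
    x, out = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def encrypt(img, key=(0.3579, 3.99, 0.7111, 3.97)):
    """Confusion (permute pixel positions), then diffusion (XOR grey levels)."""
    flat = img.ravel()
    n = flat.size
    perm = np.argsort(logistic_seq(key[0], key[1], n))       # chaotic permutation
    stream = (logistic_seq(key[2], key[3], n) * 256).astype(np.uint8)
    return (flat[perm] ^ stream).reshape(img.shape)

def decrypt(cipher, key=(0.3579, 3.99, 0.7111, 3.97)):
    flat = cipher.ravel()
    n = flat.size
    perm = np.argsort(logistic_seq(key[0], key[1], n))       # same key, same perm
    stream = (logistic_seq(key[2], key[3], n) * 256).astype(np.uint8)
    out = np.empty_like(flat)
    out[perm] = flat ^ stream                                # undo both stages
    return out.reshape(cipher.shape)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)            # toy grayscale image
cipher = encrypt(img)
restored = decrypt(cipher)
```

Because r is in the chaotic regime (near 4), a tiny change in the key values produces a completely different permutation and keystream, which is the sensitivity property chaos-based ciphers rely on; per-pixel independence of the keystream lookup is also what makes the scheme amenable to CUDA parallelization.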
Abstract:
Starting from an interpersonal conflict between managers of a medium-sized company, this research conducts a case study to investigate the effects of mediation on the flow of interaction and on affectivity within the company, focusing on 19 participants at three levels: the board, the board's staff and coordinators. Its theoretical basis comprises the mediation approaches of the traditional Harvard model, the transformative model, the narrative-circular model and facilitation, outlined from the dynamic-systems model of complexity theory. After an initial characterization of the company, pre-mediation, mediation and group facilitation techniques are used and analyzed qualitatively. To address the soundness of the findings on the effects of the mediation work, a questionnaire on interaction flow and affectivity in the company (QFI) was composed. The questionnaire results corroborate those of the mediation analysis, with 51% of employees reporting positive changes in interaction and affectivity in the company as a whole. The critical points that participants identified as reworked in mediation were: authoritarianism; excessive pressure; lack of transparency; co-responsibility; the split between the administrative and technical areas (AA-AT split); centralization; and insufficient listening. The data indicate a systemic opening in the awareness of conflicts, associated with greater joint responsibility for trying to resolve them, through the integrated and dynamic management of individual, intra-group and inter-group competencies in the company. The study therefore considers that mediation can be seen as an alternative approach to conflict resolution, with positive results for the organizational environment. Because mediation techniques are not yet widespread in our context, it recommends further research, diversifying the focus across companies of various sizes and segments.
Abstract:
One of the fundamental regulatory aspects of the real-estate market in Brazil is the set of limits on obtaining financing in the Housing Finance System (Sistema Financeiro de Habitação). These limits can be set so as to increase or reduce the supply of credit in this market, changing the behavior of its agents and, with that, the market price of properties. In this work we propose a price-formation model for the Brazilian real-estate market based on the behavior of the agents that compose it. Selling agents behave heterogeneously and are influenced by historical demand, while buying agents have their behavior determined by the availability of credit. This credit availability, in turn, is defined by the limits on granting financing in the Housing Finance System. We show that the Markov process describing the market price converges to a deterministic dynamical system as the number of agents grows, and we analyze the behavior of this dynamical system. We identify the family of random variables representing the sellers' behavior for which the system exhibits a non-trivial equilibrium price consistent with reality. We further find that the equilibrium price depends not only on the financing rules of the Housing Finance System, but also on the buyers' reservation price and on the sellers' memory of, and sensitivity to, changes in demand. The sellers' memory and sensitivity can lead to price oscillations above or below the equilibrium price (typical of bubble-formation processes), or even to a Neimark-Sacker bifurcation, in which the system exhibits stable oscillatory dynamics.
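The qualitative role of the sellers' memory and sensitivity can be illustrated with a hypothetical two-variable map: the price reacts, with some sensitivity, to the sellers' remembered demand, while demand itself is bounded by a credit limit. The functional forms and parameters below are invented for illustration; the thesis derives its map rigorously from the agents' behavior:

```python
# Hypothetical caricature of a price-formation map with seller memory.
def demand(price, credit_limit=100.0):
    d = credit_limit - price               # demand falls as the price rises
    return min(max(d, 0.0), credit_limit)  # bounded by the financing rules

def simulate(sensitivity, memory, steps=500, price=40.0, remembered=50.0,
             target=50.0):
    path = []
    for _ in range(steps):
        d = demand(price)                                    # demand at old price
        price = price + sensitivity * (remembered - target)  # sellers adjust price
        remembered = (1 - memory) * remembered + memory * d  # demand memory update
        path.append(price)
    return path

calm = simulate(sensitivity=0.02, memory=0.3)  # converges to the equilibrium
wild = simulate(sensitivity=1.5, memory=0.9)   # sustained price oscillations
```

In the sluggish regime the price settles at the equilibrium (here 50); with strong sensitivity and long-weighted memory the fixed point loses stability and bounded oscillations persist, which is the kind of transition the thesis formalizes as a Neimark-Sacker bifurcation.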
Resumo:
This research presents an adaptation of the mathematical model of fuzzy logic. The adaptation is an alternative capable of representing the behavior of a subjective variable over a time interval, while still handling static variables (as the existing computational model does). Prior research points to a gap in the treatment of dynamic (time-dependent) variables, and the proposal allows the context in which the variables are embedded to play a role in the understanding of, and decision-making on, problems with these characteristics. Existing computational models treat the temporal dimension as an event sequencer or as a cost, without considering the influence of past phenomena on the current condition; the proposed model, by contrast, lets earlier events contribute to the understanding and treatment of the current state. To cite just a few examples, the proposed solution can be applied to determining comfort levels in public transport or to assessing the degree of risk of stock-market investments. In both cases, comparisons between the existing fuzzy logic model and the suggested adaptation show a difference in the final result that can be understood as higher-quality information for decision support.
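To make the contrast with a static fuzzy evaluation concrete, one hedged reading of "past phenomena influencing the current condition" is an exponentially decayed blend of past membership degrees. The triangular membership function and the `decay` parameter are assumptions of this sketch, not the adaptation proposed in the work:

```python
def triangular(x, a, b, c):
    """Standard triangular membership function with support (a, c) and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def temporal_membership(samples, a, b, c, decay=0.7):
    """Exponentially smooth the static membership over a time series, so
    past observations still contribute to the current membership degree."""
    mu = 0.0
    for x in samples:
        static = triangular(x, a, b, c)
        mu = decay * mu + (1 - decay) * static
    return mu
```

For instance, with "crowding" samples that were high for a while and then suddenly dropped, a static evaluation of only the latest sample discards the history, while the temporal degree still reflects the recent discomfort — the kind of context-dependence the abstract argues for.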
Resumo:
Examining a team's performance from a physical point of view, its momentum might indicate unexpected turning points toward defeat or success. Physicists describe this quantity as requiring some effort to get started, but as relatively easy to keep going once a sufficient level is reached (Reed and Hughes, 2006). Unlike football, rugby, handball and many other sports, a regular volleyball match is limited not by time but by the points that need to be gathered. Every minute, more than one point is won by one team or the other, which means a series of successive points widens the gap between the teams, making it increasingly difficult to catch up with the leading one. This concept of gathering momentum, or its reverse, can give coaches, athletes and sports scientists further insight into winning and losing performances. Momentum investigations also address dependencies between performances, asking whether future performances rely on past streaks. Squash and volleyball share the characteristic of being played up to a certain number of points. Hughes et al. (2006) examined squash in terms of player momentum; the initial aim was to expand normative profiles of elite squash players, using momentum graphs of winners and errors to explore 'turning points' in a performance. Dynamic systems theory has enabled the definition of perturbations in sports that exhibit rhythms (Hughes et al., 2000; McGarry et al., 2002; Murray et al., 2008); how players and teams cause these disruptions of rhythm can reveal the way they play, and these techniques also contribute to profiling methods. Alongside the analysis of one's own performance, it is essential to understand the opposition's tactical strengths and weaknesses. By modelling the opposition's performance, it is possible to predict certain outcomes and patterns, and therefore to intervene or change tactics before the critical incident occurs.
The modelling of competitive sport is an informative analytic technique, as it directs the modeller's attention to the critical aspects of data that delineate successful performance (McGarry & Franks, 1996). By using tactical performance profiles to extract and visualise these critical aspects of performance, players can build justified and sophisticated tactical plans. The area is discussed and reviewed, critically appraising the research completed in this element of Performance Analysis.
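A momentum graph of the kind referenced above can be sketched generically as a cumulative score margin whose local peaks and troughs mark candidate turning points; the representation below is an illustration under that reading, not the cited authors' method:

```python
def momentum_series(points):
    """Cumulative score margin for team A over a rally-by-rally sequence:
    +1 when A wins the rally, -1 when the opponent does."""
    margin, series = 0, []
    for winner in points:
        margin += 1 if winner == "A" else -1
        series.append(margin)
    return series

def turning_points(series):
    """Indices where the momentum trend reverses sign, i.e. a local
    peak or trough in the cumulative margin."""
    turns = []
    for i in range(1, len(series) - 1):
        d_before = series[i] - series[i - 1]
        d_after = series[i + 1] - series[i]
        if d_before * d_after < 0:
            turns.append(i)
    return turns
```

Plotting such a series over a set makes streaks visible as monotone runs, and the flagged indices are the candidate 'turning points' a coach would inspect for tactical causes.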
Resumo:
In this study, a methodology based on a dynamical framework is proposed to incorporate additional sources of information into normalized difference vegetation index (NDVI) time series of agricultural observations for phenological state estimation. The proposed implementation is based on the particle filter (PF) scheme, which is able to integrate multiple sources of data. Moreover, the dynamics-led design can perform real-time (online) estimation, i.e., without waiting until the end of the campaign. The algorithm is evaluated by estimating the phenological states over a set of rice fields in Seville (SW Spain). A Landsat-5/7 NDVI image series is complemented with two distinct sources of information: SAR images from the TerraSAR-X satellite and air temperature measurements from a ground-based station. An improvement in overall estimation accuracy is obtained, especially when the NDVI time series is incomplete. Sensitivity to different development intervals and the mitigation of discontinuities in the time series are also addressed in this work, demonstrating the benefits of this data-fusion approach based on dynamic systems.