931 results for Spatial dynamic modeling
Abstract:
With the progress of computer technology, computers are expected to become more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may encounter difficulties in perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the aberration of the user's eye. A complete and systematic modeling approach describing the retinal image formation of the computer user was presented, drawing on modeling tools such as Zernike polynomials, the wavefront aberration, the Point Spread Function and the Modulation Transfer Function. The ocular aberration of the computer user was first measured with a wavefront aberrometer and used as the reference for the precompensation model. The dynamic precompensation was generated from the aberration rescaled to the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of the image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use. The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method, showing a significant improvement in recognition accuracy. The merit and necessity of the dynamic precompensation were also substantiated by comparing it with static precompensation. The visual benefit of the dynamic precompensation was further confirmed by the subjective assessments collected from the evaluation participants.
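A minimal sketch of this kind of pipeline, assuming a Fraunhofer pupil-function model of the Point Spread Function and a Wiener-style inverse filter; the dissertation's actual precompensation algorithm is not reproduced here, and all names and parameter values below are hypothetical.

import numpy as np

def psf_from_wavefront(wavefront_opd, pupil_mask):
    """PSF as |FFT of the generalized pupil function|^2 (Fraunhofer approximation)."""
    pupil = pupil_mask * np.exp(1j * 2 * np.pi * wavefront_opd)  # OPD expressed in waves
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2
    return psf / psf.sum()

def wiener_precompensate(image, psf, k=0.01):
    """Pre-sharpen the on-screen image so that blurring by `psf` is roughly cancelled."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + k)           # Wiener-style inverse filter
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(out, 0.0, 1.0)

# Hypothetical usage: a defocus-like wavefront over a circular pupil
n = 256
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)
wavefront = 0.5 * (2 * r2 - 1) * pupil              # Zernike defocus term, 0.5 waves
psf = psf_from_wavefront(wavefront, pupil)
target = np.random.rand(n, n)                       # stand-in for an on-screen icon
precompensated = wiener_precompensate(target, psf)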
Abstract:
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat—it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables differing in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains over their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate results by accommodating non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality of life.
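A minimal sketch of the idea behind geographically weighted regression, fitting one weighted least-squares model per location with a Gaussian spatial kernel; this is a simplified stand-in for the analysis described above, and the data and bandwidth value are hypothetical.

import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Fit a separate weighted least-squares model at each location.

    coords: (n, 2) point locations; X: (n, p) predictors (with intercept column);
    y: (n,) response. Returns an (n, p) array of local coefficients."""
    n, p = X.shape
    betas = np.empty((n, p))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)       # Gaussian spatial kernel
        W = np.diag(w)
        XtW = X.T @ W
        betas[i] = np.linalg.solve(XtW @ X, XtW @ y)  # local WLS estimate
    return betas

# Hypothetical usage with synthetic data
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(scale=0.1, size=200)
local_betas = gwr_coefficients(coords, X, y, bandwidth=20.0)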
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the trade-offs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
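A minimal sketch of the underlying idea of flagging unserializable interleavings between a pair of threads on a single shared variable; this is not McPatom itself (which works on partial order models with model checking), and the trace format and pattern set below are illustrative only.

# Unserializable two-thread patterns on one shared variable, following the classic
# access-interleaving taxonomy (local op, remote op, local op).
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("R", "W", "W"), ("W", "R", "W")}

def find_violations(trace):
    """trace: list of (thread_id, op) accesses to one shared variable, op in {"R", "W"}.
    Flags local-remote-local access triples matching a known unserializable pattern.
    Returns a list of (i, k, j) index triples into the trace."""
    violations = []
    for i, (t_local, op1) in enumerate(trace):
        for j in range(i + 1, len(trace)):
            if trace[j][0] != t_local:
                continue                     # skip until the next access by the same thread
            op3 = trace[j][1]
            for k in range(i + 1, j):
                t_remote, op2 = trace[k]
                if t_remote != t_local and (op1, op2, op3) in UNSERIALIZABLE:
                    violations.append((i, k, j))
            break
    return violations

# Hypothetical interleaving: thread A reads, thread B writes, thread A reads again
trace = [("A", "R"), ("B", "W"), ("A", "R")]
print(find_violations(trace))                # [(0, 1, 2)]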
Abstract:
Monitoring user interaction activities provides the basis for creating a user model that can be used to predict user behaviour and enable user-assistant services. The BaranC framework provides components that perform UI monitoring (collecting all associated context data), build a user model, and support services that make use of the user model. In this case study, a Next-App prediction service is built to demonstrate the use of the framework and to evaluate the usefulness of such a prediction service. Next-App analyses a user's data, learns patterns, builds a model for the user, and finally predicts, based on the user model and the current context, which application(s) the user is likely to want to use. The prediction is proactive and dynamic: it responds both to the current context and to changes in the user model, as occur over time when a user's habits change. Initial evaluation of Next-App indicates a high level of satisfaction with the service.
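A minimal sketch of a context-conditioned next-app predictor based on observed launch sequences; this is a simplified stand-in for the Next-App service, and the class, context labels and log below are hypothetical.

from collections import Counter, defaultdict

class NextAppPredictor:
    """Frequency-based predictor: counts which app tends to follow the current
    (app, context) pair and returns the most likely candidates."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def observe(self, current_app, context, next_app):
        self.counts[(current_app, context)][next_app] += 1

    def predict(self, current_app, context, k=3):
        return [app for app, _ in self.counts[(current_app, context)].most_common(k)]

# Hypothetical usage log: (app, context, app launched next)
log = [("mail", "morning", "browser"), ("mail", "morning", "browser"),
       ("mail", "morning", "calendar"), ("browser", "evening", "media_player")]
model = NextAppPredictor()
for current_app, context, nxt in log:
    model.observe(current_app, context, nxt)
print(model.predict("mail", "morning"))   # ['browser', 'calendar']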
Abstract:
In medical imaging, it is common to combine several modalities in order to take advantage of the complementary information they provide. For example, positron emission tomography (PET) can be combined with magnetic resonance imaging (MRI) to obtain information on both the subject's biological processes and anatomy. The goal of this project is to explore the synergies between MRI and PET in the context of pharmacokinetic analyses. More specifically, to exploit the high spatial resolution and the information on perfusion and vascular permeability provided by dynamic contrast-enhanced MRI in order to better estimate those same parameters for a PET radiotracer injected shortly afterwards. Accurate estimation of the radiotracer's perfusion parameters should make it possible to better quantify metabolism and to distinguish specific from non-specific accumulation. The work focused on two PET radiotracers (18F-fluorodeoxyglucose [FDG] and 18F-fluoroethyl-tyrosine [FET]) as well as an MRI contrast agent (gadopentetic acid [Gd-DTPA]) in a rat glioblastoma model. Images were acquired sequentially, first by MRI and then by PET, and blood samples were taken to obtain an arterial input function (AIF) for each molecule. The images obtained with each modality were then co-registered, and the pharmacokinetic analysis was performed by region of interest (ROI) and by voxel. For FDG, an irreversible three-compartment (two-tissue) model was used, in accordance with the literature. For FET, it was determined that an irreversible two-tissue model could be applied to the brain and the tumor, whereas a reversible two-tissue model was suitable for muscle. The possibility of converting an AIF (blood-sampled or image-derived) between Gd-DTPA and FET, or vice versa, was also studied and proved feasible for blood AIFs obtained from the caudal artery, as is the case for FDG. Finally, the combined MRI and PET pharmacokinetic analysis revealed a link between the perfusion of Gd-DTPA and that of FDG, or of FET, in muscle, but showed substantial discrepancies in the tumor. These results highlight the complexity of the tumor microenvironment (e.g., the coexistence of several transport modes for the same molecule) and the many challenges encountered in characterizing it in small animals.
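A minimal sketch of the irreversible two-tissue compartment model mentioned above, integrating dC1/dt = K1*Cp(t) - (k2 + k3)*C1 and dC2/dt = k3*C1; the arterial input function and rate constants below are purely illustrative, not values from the study.

import numpy as np
from scipy.integrate import solve_ivp

def two_tissue_irreversible(t, aif, K1, k2, k3):
    """Irreversible two-tissue compartment model (FDG-like kinetics).
    aif: callable Cp(t), the arterial input function. Returns the total
    tissue time-activity curve C1(t) + C2(t)."""
    def rhs(tt, y):
        c1, c2 = y
        return [K1 * aif(tt) - (k2 + k3) * c1, k3 * c1]
    sol = solve_ivp(rhs, (t[0], t[-1]), [0.0, 0.0], t_eval=t)
    return sol.y[0] + sol.y[1]

# Hypothetical AIF and rate constants (min^-1), purely illustrative values
aif = lambda tt: 10.0 * tt * np.exp(-tt)     # simple gamma-variate-like input
t = np.linspace(0, 60, 241)                   # 60 min, 15 s sampling
tac = two_tissue_irreversible(t, aif, K1=0.1, k2=0.15, k3=0.05)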
Abstract:
We used the results of the Spanish Otter Survey of 1994–1996, a Geographic Information System and stepwise multiple logistic regression to model otter presence/absence data in the continental Spanish UTM 10 × 10 km squares. Geographic situation, indicators of human activity such as highways and major urban centers, and environmental variables related to productivity, water availability, altitude, and environmental energy were included in a logistic model that correctly classified about 73% of otter presences and absences. We extrapolated the model to the adjacent territory of Portugal, and increased the model's spatial resolution by extrapolating it to 1 × 1 km squares across the whole Iberian Peninsula. The model turned out to be rather flexible, predicting, for instance, the species to be very restricted to the courses of rivers in some areas, and more widespread in others. This allowed us to determine areas where otter populations may be more vulnerable to habitat changes or harmful human interventions.
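A minimal sketch of a presence/absence logistic model of this kind, fitted to synthetic grid-square predictors and evaluated by its correct classification rate; the covariates and effect sizes are hypothetical, not those of the otter study.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical predictors per grid square, standing in for human-activity and
# environmental variables of the kind used in the otter model.
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.normal(size=n),          # e.g., distance to major urban centres
    rng.normal(size=n),          # e.g., annual precipitation (water availability)
    rng.normal(size=n),          # e.g., altitude
])
true_beta = np.array([-1.0, 0.8, -0.5])
presence = (rng.random(n) < 1 / (1 + np.exp(-(X @ true_beta)))).astype(int)

model = LogisticRegression().fit(X, presence)
accuracy = model.score(X, presence)          # share of squares correctly classified
print(f"correct classification rate: {accuracy:.2f}")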
Abstract:
This paper presents the development of a combined experimental and numerical approach to study the anaerobic digestion of both the wastes produced in a biorefinery using yeast for biodiesel production and the wastes generated in the preceding microbial biomass production. The experimental results show that it is possible to valorise all the tested residues through anaerobic digestion. In the implementation of the numerical model for anaerobic digestion, a procedure for identifying its parameters had to be developed. A hybrid search was used: a Genetic Algorithm followed by a direct search method. To test the parameter estimation procedure, noise-free data were considered first and a critical analysis of the results obtained was undertaken. As a demonstration of its application, the procedure was then applied to experimental data.
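A minimal sketch of a two-stage parameter identification of this kind, using an evolutionary global search followed by a Nelder-Mead direct search refinement; scipy's differential evolution stands in for the Genetic Algorithm used in the paper, and the first-order kinetic model and data below are hypothetical.

import numpy as np
from scipy.optimize import differential_evolution, minimize

# Hypothetical first-order kinetic model for cumulative methane production in a
# batch digestion test: B(t) = B0 * (1 - exp(-k t)); (B0, k) are identified from data.
t_obs = np.array([0, 2, 5, 10, 15, 20, 30], dtype=float)
b_obs = np.array([0.0, 55.0, 120.0, 200.0, 250.0, 280.0, 310.0])

def sse(params):
    b0, k = params
    return np.sum((b0 * (1 - np.exp(-k * t_obs)) - b_obs) ** 2)

# Stage 1: evolutionary global search (stand-in for the paper's Genetic Algorithm)
bounds = [(0.0, 1000.0), (1e-4, 1.0)]
global_fit = differential_evolution(sse, bounds, seed=0)

# Stage 2: local direct-search refinement starting from the global optimum
local_fit = minimize(sse, global_fit.x, method="Nelder-Mead")
print(local_fit.x)   # identified (B0, k)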
Abstract:
Understanding what characterizes patients who suffer long delays in the diagnosis of pulmonary tuberculosis is of great importance when establishing screening strategies to better control TB. Greater delays in diagnosis imply a higher chance for susceptible individuals to become infected by a bacilliferous patient. A Structured Additive Regression model is applied in this study to contribute to a better characterization of bacilliferous prevalence in Portugal. The main findings suggest significant regional differences in Portugal, with being female and/or alcohol-dependent contributing to a longer delay in diagnosis, while dependence on intravenous drugs and/or a diagnosis of HIV are factors that increase the chance of an earlier diagnosis of pulmonary TB. The decrease of treatment success in Portugal to 77% in 2010 underlines the importance of further research aimed at better TB control strategies.
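A minimal sketch of the additive-model idea, combining linear effects of binary covariates with a penalized-spline smooth of a continuous covariate; this hand-rolled fit only illustrates the model class, not the Structured Additive Regression analysis used in the study, and all data are synthetic.

import numpy as np

def spline_basis(x, knots):
    """Truncated-power cubic spline basis for one smooth covariate."""
    basis = [np.ones_like(x), x, x**2, x**3]
    basis += [np.clip(x - k, 0, None) ** 3 for k in knots]
    return np.column_stack(basis)

def fit_additive(y, Z_linear, x_smooth, knots, lam=1.0):
    """Penalized least-squares fit of a minimal additive model:
    delay ~ linear effects (e.g., sex, HIV status) + smooth(x)."""
    B = spline_basis(x_smooth, knots)
    X = np.column_stack([Z_linear, B])
    # simple ridge penalty on the spline coefficients only
    P = np.diag([0.0] * Z_linear.shape[1] + [lam] * B.shape[1])
    return np.linalg.solve(X.T @ X + P, X.T @ y)

# Hypothetical data: delay in days vs. age (smooth) and two binary covariates
rng = np.random.default_rng(2)
n = 300
age = rng.uniform(18, 80, n)
Z = rng.integers(0, 2, size=(n, 2)).astype(float)    # e.g., female, HIV-positive
delay = 30 + 10 * Z[:, 0] - 8 * Z[:, 1] + 0.01 * (age - 50) ** 2 + rng.normal(0, 5, n)
coefs = fit_additive(delay, Z, age, knots=np.linspace(25, 75, 6))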
Abstract:
Recent technological development has enabled researchers to gather data from different performance scenarios while considering players' positioning and action events within a specific time frame. This technology ranges from global positioning systems to radio frequency devices and computer vision tracking, to name the most common, and aims to collect players' time-motion data and enable the dynamical analysis of performance. Team sports—and in particular, invasion games—present a complex dynamic by nature, based on the interaction between two opposing sides trying to outperform one another. During match and training situations, players' actions are coupled to their performance context at different interaction levels. As expected, the positioning of the ball, teammates, and opponents plays an important role in this interaction process. But other factors, such as the final score, teams' development level, and players' expertise, seem to affect the match dynamics. In this symposium, we will focus on how different constraints affect invasion-game dynamics during both match and training situations. This relation will be established while underpinning the importance of these effects for game teaching and performance optimization. Regarding the match, different performance indicators based on spatial-temporal relations between players and teams will be presented to reveal the interaction processes that form the crucial component of game analysis. Considering the training, this symposium will address the relationship of small-sided games with full-sized matches and will present how players' dynamical interaction affects different performance indicators.
Abstract:
The strategic orientations of a firm are considered crucial for enhancing firm performance, and their impact can be even greater when associated with dynamic capabilities, particularly in complex and dynamic environments. This study empirically analyzes the relationship between market, entrepreneurial and learning orientations, dynamic capabilities, and performance using an integrative approach hitherto little explored. Using a sample of 209 knowledge-intensive business service firms, this paper applies structural equation modeling to explore both the direct effects of strategic orientations and the mediating role of dynamic capabilities on performance. The study demonstrates that learning orientation and one of the dimensions of entrepreneurial orientation have a direct positive effect on performance. On the other hand, dynamic capabilities mediate the relationships between some of the strategic orientations and firm performance. Overall, when dynamic capabilities are combined with the appropriate strategic orientations, they enhance firm performance. This paper contributes to a better understanding of the knowledge economy, given the important role knowledge-intensive business services play in such a dynamic and pivotal sector.
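A minimal sketch of the mediation logic being tested, using a product-of-coefficients regression approximation rather than full structural equation modeling; variable names, effect sizes and data below are hypothetical.

import numpy as np
import statsmodels.api as sm

# Hypothetical standardized survey scores for n firms
rng = np.random.default_rng(3)
n = 209
learning_orientation = rng.normal(size=n)
dynamic_capabilities = 0.6 * learning_orientation + rng.normal(scale=0.8, size=n)
performance = 0.3 * learning_orientation + 0.5 * dynamic_capabilities + rng.normal(scale=0.8, size=n)

# Path a: orientation -> dynamic capabilities
a = sm.OLS(dynamic_capabilities, sm.add_constant(learning_orientation)).fit().params[1]
# Path b: dynamic capabilities -> performance, controlling for the orientation
X = sm.add_constant(np.column_stack([learning_orientation, dynamic_capabilities]))
fit = sm.OLS(performance, X).fit()
direct_effect, b = fit.params[1], fit.params[2]
indirect_effect = a * b          # mediated (indirect) effect of the orientation
print(direct_effect, indirect_effect)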
Abstract:
Knowledge of the geographical distribution of timber tree species in the Amazon is still scarce. This is especially true at the local level, thereby limiting natural resource management actions. Forest inventories are key sources of information on the occurrence of such species. However, areas with approved forest management plans are mostly located near access roads and the main industrial centers. The present study aimed to assess the spatial-scale effects of forest inventories used as sources of occurrence data in the interpolation of potential species distribution models. The occurrence data for a group of six forest tree species were divided into four geographical areas during the modeling process. Several sampling schemes were then tested by applying the maximum entropy algorithm, using the following predictor variables: elevation, slope, exposure, normalized difference vegetation index (NDVI) and height above the nearest drainage (HAND). The results revealed that using occurrence data from only one geographical area with unique environmental characteristics increased both model overfitting to the input data and omission error rates. The use of a diagonal systematic sampling scheme and lower threshold values led to improved model performance. Forest inventories may be used to predict areas with a high probability of species occurrence, provided they are located in forest management plan regions representative of the environmental range of the model projection area.
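A minimal sketch of the omission-error evaluation mentioned above, computing the share of known occurrence points missed when a suitability map is thresholded; the suitability scores below are synthetic stand-ins for maximum-entropy model output.

import numpy as np

def omission_rate(presence_scores, threshold):
    """Share of known occurrence points that a thresholded suitability map misses."""
    return float(np.mean(presence_scores < threshold))

# Hypothetical model outputs: predicted suitability at known occurrence locations
# (e.g., from a maximum-entropy model driven by elevation, slope, NDVI and HAND)
rng = np.random.default_rng(4)
scores_at_occurrences = rng.beta(4, 2, size=120)

for thr in (0.3, 0.5, 0.7):
    print(thr, omission_rate(scores_at_occurrences, thr))
# Lower thresholds flag more area as suitable but reduce the omission error,
# mirroring the trade-off discussed above.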
Abstract:
Alzheimer's disease (AD) is the most common neurodegenerative disease in the elderly. Donepezil is the first-line drug used for AD. In section one, the experimental activity aimed to evaluate and characterize the molecular and cellular mechanisms that contribute to neurodegeneration induced by Aβ1-42 oligomers (Aβ1-42O) and the potential neuroprotective effects of the hybrid feruloyl-donepezil compound PQM130. The effects of PQM130 were compared to those of donepezil in a murine AD model obtained by intracerebroventricular (i.c.v.) injection of Aβ1-42O. The intraperitoneal administration of PQM130 (0.5-1 mg/kg) after i.c.v. Aβ1-42O injection improved learning and memory, protecting mice against spatial cognition decline. Moreover, it reduced oxidative stress, neuroinflammation and neuronal apoptosis, and promoted cell survival and protein synthesis in the mouse hippocampus. PQM130 modulated different pathways from donepezil and was more effective in counteracting Aβ1-42O damage. Section two of the experimental activity focused on studying loss-of-function variants of ABCA7. GWA studies identified mutations in the ABCA7 gene as a risk factor for AD. The mechanism through which ABCA7 contributes to AD is not clear. ABCA7 regulates lipid metabolism and critically controls phagocytic function. To investigate ABCA7 functions, CRISPR/Cas9 technology was used to engineer human iPSCs carrying the genetic variant Y622*, which introduces a premature stop codon, causing ABCA7 loss of function. Astrocytes were then generated from these iPSCs. This study revealed the effects of ABCA7 loss in astrocytes. The ABCA7 Y622* mutation induced dysfunctional endocytic trafficking, impaired Aβ clearance, lipid dysregulation and disruption of cell homeostasis, alterations that could contribute to AD. Though further studies are needed to confirm the neuroprotective role of PQM130 and the function of ABCA7 in AD, the results presented contribute to a better understanding of AD pathophysiology, suggest a new therapeutic approach to treat AD, and illustrate an innovative methodology for studying the disease.
Abstract:
This thesis deals with optimization techniques and modeling of vehicular networks. Integer linear programming (ILP) models and heuristics made it possible to study the performance of 5G networks for vehicular applications. The Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) paradigms made it possible to study the performance of different classes of service, such as Ultra-Reliable Low-Latency Communications (URLLC) and enhanced Mobile Broadband (eMBB), and how the functional split can benefit network resource management. Two different protection techniques have been studied: Shared Path Protection (SPP) and Dedicated Path Protection (DPP). These protections make it possible to meet different network reliability requirements, according to the needs of the end user. Finally, a simulator developed in Python was used to study the dynamic allocation of resources in a 5G metro network. Different provisioning algorithms and dynamic resource management techniques yielded useful results for understanding the needs of the vehicular networks that will exploit 5G. Lastly, two models are presented for reconfiguring backup resources when shared resource protection is used.
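A minimal sketch of the Dedicated Path Protection idea, routing a primary shortest path and a link-disjoint backup on a small hypothetical metro topology; the ILP formulations and the simulator used in the thesis are not reproduced, and a two-step shortest-path heuristic stands in for them.

import networkx as nx

def dedicated_path_protection(graph, src, dst):
    """Pick a primary shortest path and a link-disjoint backup path (DPP-style).
    Returns (primary, backup); raises NetworkXNoPath if no disjoint backup exists."""
    primary = nx.shortest_path(graph, src, dst, weight="weight")
    reduced = graph.copy()
    reduced.remove_edges_from(zip(primary[:-1], primary[1:]))   # forbid primary links
    backup = nx.shortest_path(reduced, src, dst, weight="weight")
    return primary, backup

# Hypothetical 5G metro topology (link weights are illustrative costs)
g = nx.Graph()
g.add_weighted_edges_from([("A", "B", 1), ("B", "C", 1), ("A", "D", 2),
                           ("D", "C", 2), ("B", "D", 1)])
print(dedicated_path_protection(g, "A", "C"))   # (['A', 'B', 'C'], ['A', 'D', 'C'])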
Abstract:
The fast development of Information and Communication Technologies (ICT) offers new opportunities to realize future smart cities. To understand, manage and forecast a city's behavior, it is necessary to analyze different kinds of data from a wide variety of acquisition systems. The aim of this research activity, in the framework of Data Science and Complex Systems Physics, is to provide stakeholders with new knowledge tools to improve the sustainability of mobility demand in future cities. From this perspective, governing the mobility demand generated by large tourist flows is becoming a vital issue for the quality of life in the historical centers of Italian cities, and it will worsen in the near future owing to the continuing globalization process. Another critical theme is sustainable mobility, which aims to reduce private transport in cities and improve multimodal mobility. We analyze the statistical properties of urban mobility in Venice, Rimini, and Bologna using different datasets provided by companies and local authorities. We develop algorithms and tools for cartography extraction, trip reconstruction, multimodality classification, and mobility simulation. We show the existence of characteristic mobility paths and statistical properties that depend on the means of transport and the type of user. Finally, we use our results to model and simulate the overall behavior of cars moving in the Emilia Romagna Region and of pedestrians moving in Venice, with software able to replicate in silico the demand for mobility and its dynamics.
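A minimal sketch of one step of trip reconstruction, splitting a time-ordered GPS trace into trips wherever the gap between consecutive fixes exceeds a stop threshold; the threshold and the trace below are hypothetical, not drawn from the Venice, Rimini or Bologna datasets.

import numpy as np

def split_into_trips(timestamps, max_gap_s=300):
    """Split a time-ordered GPS trace into trips wherever the gap between
    consecutive fixes exceeds `max_gap_s` seconds (a common stop criterion).
    Returns groups of fix indices, one group per trip."""
    timestamps = np.asarray(timestamps, dtype=float)
    gaps = np.diff(timestamps)
    cut_points = np.where(gaps > max_gap_s)[0] + 1
    return np.split(np.arange(len(timestamps)), cut_points)

# Hypothetical trace: fixes every 30 s with a 20-minute stop in the middle
ts = list(range(0, 600, 30)) + list(range(1800, 2400, 30))
trips = split_into_trips(ts)
print([len(t) for t in trips])   # two trips of 20 fixes each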
Abstract:
Air pollution is one of the greatest health risks in the world. At the same time, its strong correlation with climate change, as well as with Urban Heat Islands and Heat Waves, intensifies the effects of all these phenomena. Good air quality and high levels of thermal comfort are the main goals to be reached in urban areas in the coming years. Air quality forecasts help decision-makers improve air quality and public health strategies, mitigating the occurrence of acute air pollution episodes. Air quality forecasting approaches combine an ensemble of models to provide forecasts of global to regional air pollution, with downscaling for selected countries and regions. The development of models dedicated to urban air quality issues requires a good set of data on urban morphology and building material characteristics. Only a few examples of urban-scale air quality forecasting systems exist in the literature, and they are often limited to selected cities. This thesis sets up a methodology for the development of a forecasting tool. The forecasting tool can be adapted to any city and uses a new parametrization for vegetated areas. The parametrization method, based on aerodynamic parameters, produces the spatially varying urban roughness. At the core of the forecasting tool are an urban-scale dispersion model used in forecasting mode and the meteorological and background concentration forecasts provided by two regional numerical weather forecasting models. The tool produces one-day spatial forecasts of NO2, PM10 and O3 concentrations, air temperature, air humidity and BLQ-Air index values. The tool is automated to run every day, and the maps produced are displayed on the e-Globus platform, updated daily. The results obtained indicate that the forecast outputs were in good agreement with the observed measurements.
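A minimal sketch of a steady-state Gaussian plume estimate, the kind of relation on which simple dispersion reasoning is built; the thesis' urban-scale dispersion model and roughness parametrization are not reproduced here, and the source strength, wind speed and dispersion parameters below are hypothetical.

import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) at crosswind offset y and
    height z, for a point source of strength q (g/s), wind speed u (m/s) and release
    height h (m). sigma_y and sigma_z are the dispersion parameters at the receptor's
    downwind distance (they grow with distance and depend on atmospheric stability)."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))   # ground reflection term
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical near-ground source, receptor at breathing height on the plume axis
c = gaussian_plume(q=1.0, u=3.0, y=0.0, z=1.5, h=2.0, sigma_y=35.0, sigma_z=18.0)
print(f"{c * 1e6:.1f} ug/m^3")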