992 results for Mapping Problem
Abstract:
Magnetoencephalography (MEG) can be used to reconstruct neuronal activity with high spatial and temporal resolution. However, this reconstruction problem is ill-posed and requires the use of prior constraints in order to produce a unique solution. At present there is a multitude of inversion algorithms, each employing different assumptions, but one major problem when comparing the accuracy of these different approaches is that the true underlying electrical state of the brain is often unknown. In this study, we explore one paradigm, retinotopic mapping in the primary visual cortex (V1), for which the ground truth is known to a reasonable degree of accuracy, enabling the comparison of MEG source reconstructions with the true electrical state of the brain. Specifically, we attempted to localize, using a beamforming method, the induced responses in the visual cortex generated by a high-contrast, retinotopically varying stimulus. Although well described in primate studies, it has been an open question whether the induced gamma power in humans due to high-contrast gratings derives from V1 rather than the prestriate cortex (V2). We show that the beamformer source estimate in the gamma and theta bands does vary in a manner consistent with the known retinotopy of V1. However, these peak locations, although retinotopically organized, did not accurately localize to the cortical surface. We considered possible causes for this discrepancy and suggest that improved MEG/magnetic resonance imaging co-registration and the use of more accurate source models that take into account the spatial extent and shape of the active cortex may, in future, improve the accuracy of the source reconstructions.
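The beamforming idea behind this abstract can be illustrated with a minimal linearly constrained minimum variance (LCMV) sketch. The covariance, lead field, and regularization below are synthetic placeholders, not the authors' pipeline; the unit-gain weight formula w = C⁻¹l / (lᵀC⁻¹l) is the standard LCMV solution.

```python
import numpy as np

def lcmv_weights(C, l, reg=1e-10):
    """Unit-gain LCMV beamformer weights for lead field l and sensor covariance C:
    w = C^{-1} l / (l^T C^{-1} l). A small diagonal loading term regularizes C."""
    n = C.shape[0]
    Ci = np.linalg.inv(C + reg * np.trace(C) / n * np.eye(n))
    num = Ci @ l
    return num / (l @ num)

rng = np.random.default_rng(0)
n_sensors = 8
A = rng.standard_normal((n_sensors, 200))
C = A @ A.T / 200                    # sensor covariance estimate from 200 samples
l = rng.standard_normal(n_sensors)   # lead field of one candidate source location
w = lcmv_weights(C, l)
print(round(float(w @ l), 6))        # unit-gain constraint: w passes the source undistorted
```

Scanning such weights over a grid of candidate source locations and comparing output power in the gamma band is the essence of the localization described above.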
Abstract:
This article presents the principal results of the Ph.D. thesis Intelligent systems in bioinformatics: mapping and merging anatomical ontologies by Peter Petrov, successfully defended at the St. Kliment Ohridski University of Sofia, Faculty of Mathematics and Informatics, Department of Information Technologies, on 26 April 2013.
Abstract:
A numerical method for the Dirichlet initial boundary value problem for the heat equation in the exterior and unbounded region of a smooth closed simply connected 3-dimensional domain is proposed and investigated. This method is based on a combination of a Laguerre transformation with respect to the time variable and an integral equation approach in the spatial variables. Using the Laguerre transformation in time reduces the parabolic problem to a sequence of stationary elliptic problems which are solved by a boundary layer approach giving a sequence of boundary integral equations of the first kind to solve. Under the assumption that the boundary surface of the solution domain has a one-to-one mapping onto the unit sphere, these integral equations are transformed and rewritten over this sphere. The numerical discretisation and solution are obtained by a discrete projection method involving spherical harmonic functions. Numerical results are included.
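The time-reduction step described above can be made concrete. Under one standard scaling of the Laguerre transform (the paper's exact convention may differ), the coefficients and expansion are

```latex
u_n(x) = \kappa \int_0^\infty e^{-\kappa t} L_n(\kappa t)\, u(x,t)\, dt,
\qquad
u(x,t) = \sum_{n=0}^{\infty} u_n(x)\, L_n(\kappa t),
```

and, using the Laguerre identity $L_n'(s) = -\sum_{m=0}^{n-1} L_m(s)$, the heat equation $\partial_t u = \Delta u$ with zero initial data transforms into the recursive sequence of elliptic problems

```latex
\Delta u_n - \kappa u_n = \kappa \sum_{m=0}^{n-1} u_m, \qquad n = 0, 1, 2, \ldots
```

each of which can then be treated by the boundary integral approach of the abstract.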
Abstract:
The sensitivity of the tropics to climate change, particularly the amplitude of glacial-to-interglacial changes in sea surface temperature (SST), is one of the great controversies in paleoclimatology. Here we reassess faunal estimates of ice age SSTs, focusing on the problem of no-analog planktonic foraminiferal assemblages in the equatorial oceans that confounds both classical transfer function and modern analog methods. A new calibration strategy developed here, which uses past variability of species to define robust faunal assemblages, solves the no-analog problem and reveals ice age cooling of 5° to 6°C in the equatorial current systems of the Atlantic and eastern Pacific Oceans. Classical transfer functions underestimated temperature changes in some areas of the tropical oceans because core-top assemblages misrepresented the ice age faunal assemblages. Our finding is consistent with some geochemical estimates and model predictions of greater ice age cooling in the tropics than was inferred by Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) [1981] and thus may help to resolve a long-standing controversy. Our new foraminiferal transfer function suggests that such cooling was limited to the equatorial current systems, however, and supports CLIMAP's inference of stability of the subtropical gyre centers.
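The modern analog method mentioned above can be sketched in a few lines: estimate a fossil sample's SST as the dissimilarity-weighted mean SST of its most similar modern core-top assemblages. This is a generic illustration with made-up abundances and temperatures, not the authors' calibration.

```python
import numpy as np

def mat_sst(sample, coretops, ssts, k=3):
    """Modern analog technique (sketch): squared-chord distance from the fossil
    sample to each modern core-top assemblage, then a dissimilarity-weighted
    mean SST of the k best analogs."""
    d = np.sum((np.sqrt(coretops) - np.sqrt(sample)) ** 2, axis=1)
    best = np.argsort(d)[:k]
    w = 1.0 / (d[best] + 1e-9)            # closer analogs get larger weight
    return float(np.sum(w * ssts[best]) / np.sum(w))

# Toy data: relative abundances of 3 species in 4 modern core tops.
coretops = np.array([[0.7, 0.2, 0.1],
                     [0.1, 0.8, 0.1],
                     [0.3, 0.3, 0.4],
                     [0.0, 0.1, 0.9]])
ssts = np.array([27.0, 22.0, 18.0, 12.0])
print(mat_sst(np.array([0.7, 0.2, 0.1]), coretops, ssts, k=1))  # exact analog: 27.0
```

The no-analog problem discussed in the abstract arises precisely when every distance d is large, so that no weighted average of core tops is trustworthy.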
Abstract:
Integration of the measurement activity into the production process is an essential rule in digital enterprise technology, especially for large-volume product manufacturing such as the aerospace, shipbuilding, power generation and automotive industries. Measurement resource planning is a structured method of selecting and deploying the measurement resources necessary to implement the quality aims of product development. In this research, a new mapping approach for measurement resource planning is proposed. Firstly, quality aims are identified in the form of a number of specifications and engineering requirements of the quality characteristics (QCs) at a specific stage of the product life cycle, and measurement systems are classified according to the attributes of the QCs. Secondly, a matrix mapping approach for measurement resource planning is outlined, together with an optimization algorithm for matching quality aims with measurement systems. Finally, the proposed methodology is applied in shipbuilding to solve the problem of measurement resource planning, deploying the measurement resources so as to satisfy all the quality aims.
Abstract:
In a world where students are increasingly digitally tethered to powerful, ‘always on’ mobile devices, new models of engagement and new approaches to teaching and learning are required from educators. Serious Games (SG) have proven instructional potential, but there is still a lack of methodologies and tools not only for their design but also to support game analysis and assessment. This paper explores the use of SG to increase student engagement and retention. The development phase of the Circuit Warz game is presented to demonstrate how electronic engineering education can be radically reimagined to create immersive, highly engaging learning experiences that are problem-centered and pedagogically sound. The Learning Mechanics–Game Mechanics (LM-GM) framework for SG analysis is introduced, and its practical use in an educational game design scenario is shown as a case study.
Abstract:
Value-stream mapping (VSM) is a helpful tool for identifying waste and improvement areas, and it has emerged as a preferred way to support and implement the lean approach. While lean principles are well established and have broad applicability in manufacturing, their extension to information technology is still limited. Based on a case study approach, this paper presents the implementation of VSM in an IT firm as a lean IT improvement initiative. It involves mapping the current activities of the firm and identifying opportunities for improvement. After several interviews with employees currently involved in the process, a current-state map is prepared to describe the existing problem areas, and a future-state map is prepared to show the proposed improvement action plans. The achievements of the VSM implementation are reductions in lead time, cycle time and resources. Our findings indicate that, with the new process change, the total lead time for the database provisioning process can be reduced from 20 days to 3 days, an 85% reduction.
Abstract:
This thesis project studies the agent identity privacy problem in the scalar linear quadratic Gaussian (LQG) control system. To address it, privacy models and privacy measures must first be established. The problem has two distinguishing characteristics: it depends on a trajectory of correlated data rather than a single observation, and the agent identity is a binary hypothesis: Agent A or Agent B. I propose privacy models and corresponding privacy measures that take these two characteristics into account. An eavesdropper is assumed to perform a hypothesis test on the agent identity based on the intercepted environment state sequence. The privacy risk is measured by the Kullback-Leibler divergence between the probability distributions of the state sequences under the two hypotheses. By taking into account both the accumulated control reward and the privacy risk, an optimization problem over the policy of Agent B is formulated. The optimal deterministic privacy-preserving LQG policy of Agent B is a linear mapping, and a sufficient condition is given to guarantee that this policy is time-invariant in the asymptotic regime. Adding an independent Gaussian random variable cannot improve the performance of Agent B. Based on the privacy model and the LQG control model, I formulate the mathematical problems for the agent identity privacy problem in LQG control, addressing two design objectives: maximizing the control reward and minimizing the privacy risk. I conduct a theoretical analysis of the LQG control policy in the agent identity privacy problem and of the trade-off between the control reward and the privacy risk. Finally, the theoretical results are justified by numerical experiments, which also illustrate the reward-privacy trade-off; observations and insights drawn from the numerical results are explained in the last chapter.
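The privacy measure described (KL divergence between the Gaussian distributions of state trajectories) can be illustrated numerically. The sketch below uses assumed scalar dynamics and noise values, not the thesis's actual system: it builds the trajectory covariance for x_{t+1} = a x_t + w_t and evaluates the closed-form Gaussian KL divergence, showing why a whole trajectory (not a single sample) is what the eavesdropper exploits.

```python
import numpy as np

def gaussian_kl(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians."""
    k = len(mu0)
    S1i = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1i @ S0) + d @ S1i @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def traj_cov(a, T, q=1.0):
    """Covariance of the state sequence (x_1, ..., x_T) for the scalar system
    x_{t+1} = a x_t + w_t,  w_t ~ N(0, q),  x_0 = 0."""
    var = np.empty(T)
    v = 0.0
    for t in range(T):
        v = a * a * v + q
        var[t] = v
    C = np.empty((T, T))
    for s in range(T):
        for t in range(T):
            i, j = (s, t) if s <= t else (t, s)
            C[s, t] = a ** (j - i) * var[i]  # cov(x_s, x_t) = a^{|t-s|} var(x_min)
    return C

T = 10
mu = np.zeros(T)
kl = gaussian_kl(mu, traj_cov(0.9, T), mu, traj_cov(0.5, T))
print(kl > 0)  # different closed-loop dynamics make the two agents distinguishable
```

A privacy-preserving policy for Agent B would shape its closed-loop dynamics to shrink this divergence while sacrificing as little control reward as possible.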
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions.
Abstract:
In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit using the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that uses solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
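As a toy illustration of the biased random-key idea (not the authors' implementation), each chromosome is a vector of keys in [0, 1); a decoder visits the bids in key order and accepts them greedily when their item sets are still free, and evolution combines elitism, biased uniform crossover, and random mutants. The bid data below is hypothetical.

```python
import random

# Toy instance: each bid is (set_of_items, price). Hypothetical data.
bids = [({0, 1}, 10), ({1, 2}, 8), ({2}, 5), ({0}, 4), ({3}, 6)]

def decode(keys):
    """Random-key decoder: visit bids in increasing key order, accept a bid
    greedily if none of its items has been sold yet; fitness = total profit."""
    order = sorted(range(len(bids)), key=lambda i: keys[i])
    sold, profit = set(), 0
    for i in order:
        items, price = bids[i]
        if sold.isdisjoint(items):
            sold |= items
            profit += price
    return profit

def brkga(pop=30, elite=8, mutants=5, rho=0.7, gens=50, seed=1):
    """Minimal biased random-key GA: elitism + biased uniform crossover + mutants."""
    rng = random.Random(seed)
    n = len(bids)
    P = [[rng.random() for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=decode, reverse=True)
        nxt = P[:elite]                        # elite chromosomes survive unchanged
        while len(nxt) < pop - mutants:        # biased crossover: elite x non-elite
            e, o = rng.choice(P[:elite]), rng.choice(P[elite:])
            nxt.append([e[j] if rng.random() < rho else o[j] for j in range(n)])
        nxt += [[rng.random() for _ in range(n)] for _ in range(mutants)]
        P = nxt
    return max(decode(c) for c in P)

print(brkga())  # optimum for this toy instance is 21
```

The LP-relaxation initialization described in the abstract would replace some of the random initial chromosomes with keys derived from fractional relaxation solutions.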
Abstract:
Ecological science contributes to solving a broad range of environmental problems. However, lack of ecological literacy in practice often limits application of this knowledge. In this paper, we highlight a critical but often overlooked demand on ecological literacy: to enable professionals of various careers to apply scientific knowledge when faced with environmental problems. Current university courses on ecology often fail to persuade students that ecological science provides important tools for environmental problem solving. We propose problem-based learning to improve the understanding of ecological science and its usefulness for real-world environmental issues that professionals in careers as diverse as engineering, public health, architecture, social sciences, or management will address. Courses should set clear learning objectives for cognitive skills they expect students to acquire. Thus, professionals in different fields will be enabled to improve environmental decision-making processes and to participate effectively in multidisciplinary work groups charged with tackling environmental issues.
Abstract:
Dulce de leche samples available in the Brazilian market were submitted to sensory profiling by quantitative descriptive analysis and an acceptance test, as well as sensory evaluation using the just-about-right scale and purchase intent. External preference mapping and the ideal sensory characteristics of dulce de leche were determined. The results were also evaluated by principal component analysis, hierarchical cluster analysis, partial least squares regression, artificial neural networks, and logistic regression. Overall, significant product acceptance was related to intermediate scores of the sensory attributes in the descriptive test, and this trend was observed even after consumer segmentation. The results obtained by sensometric techniques showed that optimizing an ideal dulce de leche from the sensory standpoint is a multidimensional process, with necessary adjustments on the appearance, aroma, taste, and texture attributes of the product for better consumer acceptance and purchase. The optimum dulce de leche was characterized by high scores for the attributes sweet taste, caramel taste, brightness, color, and caramel aroma, in accordance with the preference mapping findings. In industrial terms, this means changing the parameters used in the thermal treatment and quantitative changes in the ingredients used in formulations.
Abstract:
The evolution and population dynamics of avian coronaviruses (AvCoVs) remain underexplored. In the present study, in-depth phylogenetic and Bayesian phylogeographic studies were conducted to investigate the evolutionary dynamics of AvCoVs detected in wild and synanthropic birds. A total of 500 samples, including tracheal and cloacal swabs collected from 312 wild birds belonging to 42 species, were analysed using molecular assays. A total of 65 samples (13%) from 22 bird species were positive for AvCoV. Molecular evolution analyses revealed that the sequences from samples collected in Brazil did not cluster with any of the AvCoV S1 gene sequences deposited in the GenBank database. Bayesian framework analysis estimated an AvCoV strain from Sweden (1999) as the most recent common ancestor of the AvCoVs detected in this study. Furthermore, the analysis inferred an increase in the AvCoV dynamic demographic population in different wild and synanthropic bird species, suggesting that birds may be potential new hosts responsible for spreading this virus.
Abstract:
Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast-growing analytical methodology in the life sciences. This method provides a multitude of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12-160 µm. Selected applications in medical research require an improved lateral resolution of laser-induced mass spectrometric techniques at the low micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology, laser microdissection coupled to inductively coupled plasma mass spectrometry (LMD ICP-MS), to obtain elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of native and uranium-stained mouse brain tissue samples are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
This paper addresses the capacitated lot sizing problem (CLSP) with a single stage composed of multiple plants, items and periods, with setup carry-over among the periods. The CLSP is well studied, and many heuristics have been proposed to solve it. Nevertheless, little research has explored the multi-plant capacitated lot sizing problem (MPCLSP), meaning that few solution methods have been proposed for it. Furthermore, to our knowledge, no study of the MPCLSP with setup carry-over exists in the literature. This paper presents a mathematical model and a GRASP (Greedy Randomized Adaptive Search Procedure) with path relinking for the MPCLSP with setup carry-over. This solution method is an extension and adaptation of a previously adopted methodology without setup carry-over. Computational tests showed that incorporating setup carry-over yields a significant improvement in solution value with only a small increase in computational time.
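The GRASP metaheuristic named above follows a fixed skeleton: a greedy randomized construction using a restricted candidate list (RCL), followed by local search, repeated over many iterations. The sketch below runs that skeleton on a hypothetical item-to-plant assignment toy, not the MPCLSP model itself; the real method additionally handles capacities, setup carry-over, and path relinking between elite solutions.

```python
import random

def grasp(costs, alpha=0.3, iters=50, seed=0):
    """Minimal GRASP skeleton: assign each item to one plant minimizing total
    cost (toy stand-in for a lot-sizing subproblem)."""
    rng = random.Random(seed)
    n_items, n_plants = len(costs), len(costs[0])

    def construct():
        """Greedy randomized construction with a restricted candidate list."""
        sol = []
        for i in range(n_items):
            row = costs[i]
            lo, hi = min(row), max(row)
            # RCL: plants whose cost is within alpha of the best choice
            rcl = [p for p in range(n_plants) if row[p] <= lo + alpha * (hi - lo)]
            sol.append(rng.choice(rcl))
        return sol

    def local_search(sol):
        """First-improvement reassignment moves until no move helps."""
        improved = True
        while improved:
            improved = False
            for i in range(n_items):
                best = min(range(n_plants), key=lambda p: costs[i][p])
                if costs[i][best] < costs[i][sol[i]]:
                    sol[i] = best
                    improved = True
        return sol

    def value(sol):
        return sum(costs[i][sol[i]] for i in range(n_items))

    best = None
    for _ in range(iters):
        s = local_search(construct())
        if best is None or value(s) < value(best):
            best = s
    return value(best)

costs = [[3, 1], [2, 5], [4, 4]]  # cost of producing item i at plant p (toy data)
print(grasp(costs))               # best assignment cost: 7
```

Because the toy items are independent, local search alone reaches the optimum here; in the MPCLSP the moves interact through shared capacity and setup carry-over, which is where the randomized construction and path relinking earn their keep.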