949 results for Cadastral updating


Relevance:

10.00%

Publisher:

Abstract:

This thesis deals with structure formation in poor solvent in one- and two-component polymer brushes, in which polymer chains are anchored to a substrate by grafting. Such systems exhibit lateral structure formation that gives rise to interesting applications. The polymers are moved by continuum Monte Carlo simulations based on CBMC algorithms and local monomer displacements. A newly developed variant of the CBMC algorithm allows inner parts of the chains to move, since the existing algorithm does not relax the monomers near the grafting monomer well. To investigate the phase behavior, several analysis methods are developed and adapted: these include Minkowski measures for analyzing the structure of binary brushes, and grafting correlations for studying the influence of grafting patterns. In one-component brushes, structure formation occurs only in weakly grafted systems; dense grafting leads to closed brushes without lateral structure. For the gradual transition between the closed and the ruptured brush, a temperature range in which the transition takes place is determined. The influence of the grafting pattern (a disturbance of the formation of long-range order) on the brush configuration is evaluated using the grafting correlations. With irregular grafting, the structures formed are larger than with regular grafting and are also more stable against higher temperatures. In binary systems, structures form even at dense grafting. In addition to the parameters temperature, grafting density, and grafting pattern, the composition of the two components comes into play, so further structures become possible: at equal proportions of the two components, stripe-like lamellar patterns form; at unequal proportions, the minority component forms clusters embedded in the majority component. Even in regularly grafted systems, no long-range order develops. In binary brushes, too, the grafting pattern strongly influences structure formation. Irregular grafting patterns lead to separation of the components already at higher temperatures, but the structures formed are more irregular and somewhat larger than in regularly grafted systems. In contrast to self-consistent field theory, the simulations take fluctuations in the grafting into account and therefore agree better with experiment.
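
The local-displacement Monte Carlo move described above is easy to illustrate in code. The following is a minimal Python sketch under assumptions of my own (a single grafted bead-spring chain, harmonic bonds, a Lennard-Jones attraction standing in for the poor solvent, illustrative parameter values); it shows only the Metropolis acceptance step and omits the CBMC regrowth moves developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_energy(r2, eps=1.0, sigma=1.0):
    # Lennard-Jones pair energy; the attractive tail models the poor solvent.
    sr6 = (sigma**2 / r2) ** 3
    return 4.0 * eps * (sr6**2 - sr6)

def chain_energy(chain, k_bond=50.0, r0=1.0):
    # Harmonic bonds along the chain plus non-bonded LJ pairs.
    e = 0.0
    for i in range(len(chain) - 1):
        d = np.linalg.norm(chain[i + 1] - chain[i])
        e += 0.5 * k_bond * (d - r0) ** 2
    for i in range(len(chain)):
        for j in range(i + 2, len(chain)):
            e += pair_energy(np.sum((chain[i] - chain[j]) ** 2))
    return e

def local_move(chain, beta=2.0, delta=0.2):
    # Displace one non-grafted monomer; accept with the Metropolis rule.
    i = rng.integers(1, len(chain))        # monomer 0 stays grafted
    trial = chain.copy()
    trial[i] += rng.uniform(-delta, delta, size=3)
    if trial[i, 2] < 0.0:                  # reject moves below the substrate
        return chain
    dE = chain_energy(trial) - chain_energy(chain)
    if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
        return trial
    return chain

# A straight chain of 10 monomers grafted at the origin, pointing up in z.
chain = np.cumsum(np.vstack([np.zeros(3), np.tile([0, 0, 1.0], (9, 1))]), axis=0)
for _ in range(1000):
    chain = local_move(chain)
```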

Relevance:

10.00%

Publisher:

Abstract:

The general theme of the present inquiry is the role of training and the continuous updating of knowledge and skills in relation to the concepts of employability and social vulnerability. The empirical research covered the period from 13 February 2010 to 31 December 2010, and the data refer to a very specific context: courses funded by the Emilia Romagna region and targeted at workers on cassa integrazione in deroga (an exceptional wage-guarantee scheme) domiciled in the region. The investigation was carried out in a vocational training organization accredited by Emilia Romagna for the provision of publicly funded training courses. The quantitative data collected are limited to the region and distributed across all the provinces of Emilia Romagna. The study addresses the role of continuing education throughout life and the importance of updating knowledge and skills, as privileged instruments for coping with the instability of the labor market and as a strategy for reducing the risk of unemployment. Starting from the different strategies that workers put in place during their professional careers, we introduce two concepts that have become common in the so-called knowledge society: social vulnerability and employability. In modern organizations, what becomes relevant is the knowledge that workers bring and the relationships that develop between people, which allow such knowledge and skills to spread and multiply. Knowledge thus becomes the primary productive force, defined by Davenport and Prusak (1998) as a "fluid combination of experience, values, contextual information and specialist knowledge that provides a framework for the evaluation and assimilation of new experience and new information". Learning at work is by no means always stable, explicit, and conscious, or even enjoyable for everyone, especially outside of a formal training intervention. The study then addresses the specific issue of training in a labor market that is increasingly deconstructed.

Relevance:

10.00%

Publisher:

Abstract:

We first study the different cadastral systems of the EU countries and their prospects in the context of European law, especially with respect to their tax law aspects and the different systems of building taxation. We discuss the most important aspect, taxation, and the influence of the European Union, particularly of the European Court. We consider not only the influence on the Member States' building taxes but also other channels of influence through various European policies. All these aspects, among others, point to a tendency toward cadastral integration: not a direct one, but one that exists indirectly. The study also maintains the dual nature of the cadastre: a social nature (as a social science) together with its tax aspect on the one hand, and a technical nature on the other. The INSPIRE information network can open a new channel for the exchange of tax information between European countries. The investigation ends with a comparison of the different cadastral systems of the EU countries, and of their building tax law. This report upholds the fiscal nature of the cadastre and the need to regard it as a socio-technical complex. Various international organizations consider it a multipurpose instrument and institution, but they seem to forget its original purpose, the tax purpose, which was central at its origin and which cannot be forgotten in the new world emerging after the global financial crisis.

Relevance:

10.00%

Publisher:

Abstract:

The national context has recently changed with the introduction of the new geodetic system, coincident with the European one (ETRS89, frame ETRF00) and realized by the stations of the Rete Dinamica Nazionale. This geodetic system, associated with the UTM_ETRF00 cartographic system, has become mandatory by decree for public administrations. The change has made it possible to survey cartographic data in much more accurate absolute ETRF00 coordinates. When data surveyed in this way are used for cartographic updates, however, they lose their original coordinates and are adapted to the surrounding cartographic features. To design a modernization of cadastral maps and technical maps that allows updates to be introduced without altering their original absolute coordinates, the study began by evaluating how to exploit developments in the structuring of topographic data in the Database Geotopografico, 3D building modeling in the INSPIRE cadastral experiences, and the integration, within the MUDE framework, of building projects and their realizations. The study then evaluated the NRTK real-time positioning services available in Italy, and experiments were carried out to verify locally the precision and reliability of the available positioning services. The critical state of cadastral cartography essentially stems from two facts: it was originally framed in 850 local systems and later transformed into Roma40 with only a sparse density of re-measured points, and until 1988 it was updated with non-rigorous, low-quality procedures. To resolve these issues, the study proposes exploiting NRTK surveying to increase the local density of re-measured points and to re-frame the cadastral maps. The test, carried out in Bologna, required a preliminary analysis to identify which fiducial points (Punti Fiduciali) could be considered consistent with the cartographic specifications, and then to use them to increase the local density of re-measured points. The experiment led to the realization of the project, so that future updates can be inserted without altering the ETRF00 coordinates obtained from the positioning service.
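
The preliminary coherence analysis of the fiducial points lends itself to a small illustration. The sketch below is hypothetical: the point names, coordinates, and tolerance are invented for the example, not taken from the Bologna test or from the official cadastral specifications. It merely shows the kind of planimetric discrepancy check implied by the text.

```python
import math

# Hypothetical fiducial points: name -> ((E_map, N_map), (E_nrtk, N_nrtk)) in metres.
points = {
    "PF01": ((685120.41, 4929315.77), (685120.52, 4929315.69)),
    "PF02": ((685340.10, 4929501.03), (685340.95, 4929501.80)),
}

TOLERANCE_M = 0.40  # illustrative cartographic tolerance, not the official value

def coherent(map_xy, nrtk_xy, tol=TOLERANCE_M):
    # A fiducial point is kept only if the planimetric discrepancy between
    # its map position and its NRTK re-measurement is within tolerance.
    dE = nrtk_xy[0] - map_xy[0]
    dN = nrtk_xy[1] - map_xy[1]
    return math.hypot(dE, dN) <= tol

kept = [name for name, (m, n) in points.items() if coherent(m, n)]
print(kept)  # -> ['PF01']
```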

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Advances in biotechnology have shed light on many biological processes. In biological networks, nodes are used to represent the function of individual entities within a system and have historically been studied in isolation. Network structure adds edges that enable communication between nodes. An emerging field combines node function and network structure to yield network function. One of the most complex networks known in biology is the neural network within the brain. Modeling neural function will require an understanding of networks, dynamics, and neurophysiology. In this work, modeling techniques are developed to work at this complex intersection. Methods: Spatial game theory was developed by Nowak in the context of modeling evolutionary dynamics, the way in which species evolve over time. Spatial game theory offers a two-dimensional view of analyzing the state of neighbors and updating based on the surroundings. Our work builds upon this foundation by studying evolutionary game theory networks with respect to neural networks. The novel concept is that neurons may adopt a particular strategy that allows the propagation of information; the strategy may therefore act as the mechanism for gating. Furthermore, the strategy of a neuron, as in a real brain, is impacted by the strategies of its neighbors. The techniques of spatial game theory already established by Nowak are repeated to explain two basic cases and to validate the implementation of the code. Two novel modifications, introduced in Chapters 3 and 4, build on this network and may reflect neural networks. Results: The introduction of the two novel modifications, mutation and rewiring, in large parametric studies resulted in dynamics in which an intermediate number of nodes fire at any given time. Further, even small mutation rates result in different dynamics, more representative of the hypothesized ideal state. Conclusions: In both modifications to Nowak's model, the results demonstrate that the network does not become locked into a particular global state of passing all information or blocking all information. It is hypothesized that normal brain function occurs within this intermediate range and that a number of diseases are the result of moving outside of this range.
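
For readers unfamiliar with Nowak's spatial games, the following Python sketch shows the basic imitation dynamics on a grid together with the mutation modification mentioned in the results. The payoff values, grid size, and mutation rate are illustrative assumptions, and the rewiring modification is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)
N, b, mutation_rate = 50, 1.8, 0.01      # grid size, defector temptation, mutation

grid = rng.integers(0, 2, size=(N, N))   # 1 = cooperate, 0 = defect

def payoffs(g):
    # Each site plays its 4 von Neumann neighbours: C vs C scores 1,
    # D vs C scores b, everything else scores 0 (Nowak-May payoffs).
    score = np.zeros_like(g, dtype=float)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(g, shift, axis=(0, 1))
        score += np.where(g == 1, nb, b * nb)
    return score

def step(g):
    s = payoffs(g)
    # Imitation update: every site copies the strategy of its best-scoring
    # neighbour (keeping its own if it scores best) -- the classic rule.
    best_score, best_strat = s.copy(), g.copy()
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ns, ng = np.roll(s, shift, axis=(0, 1)), np.roll(g, shift, axis=(0, 1))
        better = ns > best_score
        best_score = np.where(better, ns, best_score)
        best_strat = np.where(better, ng, best_strat)
    # Mutation, the first modification described above: each site flips its
    # strategy with small probability, independently of the payoffs.
    flip = rng.random(g.shape) < mutation_rate
    return np.where(flip, 1 - best_strat, best_strat)

for _ in range(200):
    grid = step(grid)
print(grid.mean())   # fraction of "cooperating" (information-passing) sites
```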

Relevance:

10.00%

Publisher:

Abstract:

For various reasons, it is important, if not essential, to integrate the computations and code used in data analyses, methodological descriptions, simulations, etc. with the documents that describe and rely on them. This integration allows readers to both verify and adapt the statements in the documents. Authors can easily reproduce them in the future, and they can present the document's contents in a different medium, e.g., with interactive controls. This paper describes a software framework for authoring and distributing these integrated, dynamic documents that contain text, code, data, and any auxiliary content needed to recreate the computations. The documents are dynamic in that the contents, including figures, tables, etc., can be recalculated each time a view of the document is generated. Our model treats a dynamic document as a master or "source" document from which one can generate different views in the form of traditional, derived documents for different audiences. We introduce the concept of a compendium as both a container for the different elements that make up the document and its computations (i.e. text, code, data, ...), and as a means for distributing, managing and updating the collection. The step from disseminating analyses via a compendium to reproducible research is a small one. By reproducible research, we mean research papers with accompanying software tools that allow the reader to directly reproduce the results and employ the methods that are presented in the research paper. Some of the issues involved in paradigms for the production, distribution and use of such reproducible research are discussed.
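
A toy example may make the idea concrete. The sketch below is not the authors' framework, which targets a full compendium of text, code and data; it only demonstrates, with a made-up chunk syntax, how a dynamic "source" document can have its code chunks re-executed each time a view is generated.

```python
import io
import re
from contextlib import redirect_stdout

# A "source" document: prose with embedded code chunks. Each time a view
# is generated, the chunks are re-executed and their output is woven in.
SOURCE = """\
The mean of the data is:
<<code>>
data = [2, 4, 6, 8]
print(sum(data) / len(data))
<<end>>
which the reader can verify by re-running this document.
"""

def weave(source):
    # Replace each chunk with the captured output of executing it; state
    # persists across chunks so later code can reuse earlier results.
    env = {}
    def run(match):
        buf = io.StringIO()
        with redirect_stdout(buf):
            exec(match.group(1), env)
        return buf.getvalue().rstrip() + "\n"
    return re.sub(r"<<code>>\n(.*?)<<end>>\n", run, source, flags=re.S)

print(weave(SOURCE))   # the derived view shows "5.0" in place of the chunk
```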

Relevance:

10.00%

Publisher:

Abstract:

We present a state-of-the-art application of smoothing for dependent bivariate binomial spatial data to Loa loa prevalence mapping in West Africa. This application is special because it starts with the non-spatial calibration of survey instruments, continues with the spatial model building and assessment, and ends with robust, tested software that will be used by the field scientists of the World Health Organization for online prevalence map updating. From a statistical perspective, several important methodological issues were addressed: (a) building spatial models that are complex enough to capture the structure of the data but remain computationally usable; (b) reducing the computational burden in the handling of very large covariate data sets; (c) devising methods for comparing spatial prediction methods for a given exceedance policy threshold.
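
Issue (c), comparing prediction methods for a given exceedance policy threshold, reduces in practice to turning posterior prevalence samples into exceedance probabilities. The sketch below illustrates only that last step; the samples, locations, and the 0.20 threshold are invented for the example rather than taken from the Loa loa application.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical posterior samples of prevalence at 3 village locations,
# e.g. drawn from a fitted spatial binomial model (rows = MCMC samples).
samples = rng.beta(a=[2, 8, 15], b=[18, 12, 5], size=(4000, 3))

THRESHOLD = 0.20  # illustrative policy threshold on prevalence

# Exceedance probability: the posterior probability that prevalence at a
# location exceeds the policy threshold -- the quantity on which spatial
# prediction methods are compared.
exceedance = (samples > THRESHOLD).mean(axis=0)
print(exceedance)   # e.g. roughly [0.07, 0.62, 0.98]
```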

Relevance:

10.00%

Publisher:

Abstract:

Effective techniques for organizing and visualizing large image collections are in growing demand as visual search becomes increasingly popular. iMap is a treemap representation for visualizing and navigating image search and clustering results based on the evaluation of image similarity using both visual and textual information. iMap not only makes effective use of the available display area to arrange images but also maintains a stable update when images are inserted or removed during the query. A key challenge of using iMap lies in the difficulty of following and tracking the changes when the image arrangement is updated as the query image changes. For many information visualization applications, showing the transition when interacting with the data is critically important, as it can help users better perceive the changes and understand the underlying data. This work investigates the effectiveness of animated transitions in a tiled image layout where the spiral arrangement of the images is based on their similarity. Three aspects of animated transition are considered: animation steps, animation actions, and flying paths. Exploring and weighing the advantages and disadvantages of different methods for each aspect, in conjunction with the characteristics of the spiral image layout, we present an integrated solution, called AniMap, for animating the transition from an old layout to a new layout when a different image is selected as the query image. To smooth the animation and reduce the overlap among images during the transition, we explore different factors that might have an impact on the animation and propose our solution accordingly. We show the effectiveness of our animated transition solution by presenting experimental results and conducting a comparative user study.
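
To make the spiral arrangement and the flying paths concrete, here is a small hypothetical sketch: a square-spiral placement of ranked images and a straight-line interpolation of tile positions between an old and a new layout. Neither function is taken from iMap or AniMap; both are illustrative stand-ins for the layout and transition described above.

```python
def spiral_positions(n, cell=1.0):
    # Place n images on a square spiral around the centre: the most similar
    # image sits at (0, 0), less similar ones spiral outward.
    x = y = 0
    dx, dy = 1, 0
    positions, steps, taken, turns = [], 1, 0, 0
    for _ in range(n):
        positions.append((x * cell, y * cell))
        x, y = x + dx, y + dy
        taken += 1
        if taken == steps:
            taken, turns = 0, turns + 1
            dx, dy = -dy, dx            # rotate direction by 90 degrees
            if turns % 2 == 0:          # lengthen the arm every two turns
                steps += 1
    return positions

def flying_path(p_old, p_new, t):
    # Straight-line flying path: linearly interpolate a tile's position at
    # animation progress t in [0, 1]. Curved paths are one alternative.
    return (p_old[0] + t * (p_new[0] - p_old[0]),
            p_old[1] + t * (p_new[1] - p_old[1]))

old = spiral_positions(9)
new = old[3:] + old[:3]                  # a hypothetical re-ranking
frames = [[flying_path(o, n, s / 10) for o, n in zip(old, new)]
          for s in range(11)]            # 11 animation steps
```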

Relevance:

10.00%

Publisher:

Abstract:

Part I: What makes science hard for newcomers?
1) The background (briefly) of my research (why the math-anxiety model doesn't fit)
2) The tier analysis (a visual); message: there are more types of science learners in your class than simply younger versions of yourself
3) Three approaches (bio, chem, physics), but only one Nature
4) The (different) vocabularies of the three sciences
5) How mathematics is variously used in science

Part II: Rules and rules-driven assignments (lQ vs OQ)
1) How to incorporate creativity into assignments and tests?
2) Tests: borrowing "thought questions" from other fields (If Columbus hadn't discovered the New World, when and under whose law would it have been discovered?)
3) Grading practices (partial credit; post-exam credit for finding and explaining nontrivial errors)
4) Icing on the cake: applications and examples of science/engineering from Tuesday's NY Times

Part III: Making change at the departmental level
1) Taking control of at least some portion of the curriculum
2) Varying the style of presentation
3) Taking control of at least some portion of the exams
4) Grading: pros and cons of grading on a curve
5) Updating labs and lab reporting

Relevance:

10.00%

Publisher:

Abstract:

Conscious events interact with memory systems in learning, rehearsal and retrieval (Ebbinghaus 1885/1964; Tulving 1985). Here we present hypotheses that arise from the IDA computational model (Franklin, Kelemen and McCauley 1998; Franklin 2001b) of global workspace theory (Baars 1988, 2002). Our primary tool for this exploration is a flexible cognitive cycle employed by the IDA computational model and hypothesized to be a basic element of human cognitive processing. Since cognitive cycles are hypothesized to occur five to ten times a second and include interaction between conscious contents and several of the memory systems, they provide the means for an exceptionally fine-grained analysis of various cognitive tasks. We apply this tool to the small effect size of subliminal learning compared to supraliminal learning, to process dissociation, to implicit learning, to recognition vs. recall, and to the availability heuristic in recall. The IDA model elucidates the role of consciousness in the updating of perceptual memory, transient episodic memory, and procedural memory. In most cases, memory is hypothesized to interact with conscious events for its normal functioning. The methodology of the paper is unusual in that the hypotheses and explanations presented are derived from an empirically based, but broad and qualitative, computational model of human cognition.
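
The cognitive cycle can be caricatured in a few lines of code. The following is a deliberately schematic Python sketch of one cycle (perceive, compete for consciousness, broadcast to memory systems, feed action selection); it is not the IDA implementation, and every class and name in it is invented for illustration.

```python
import random

class Memory:
    # Stand-ins for the memory systems named above (perceptual, transient
    # episodic, procedural); the real IDA model is far richer.
    def __init__(self, name):
        self.name, self.traces = name, []
    def update(self, content):
        self.traces.append(content)

memories = [Memory("perceptual"), Memory("transient_episodic"), Memory("procedural")]

def cognitive_cycle(stimulus):
    # One cycle: perceive -> compete for consciousness -> broadcast ->
    # memory updates -> action selection. Hypothesized to run at 5-10 Hz.
    percepts = [(random.random(), f"feature:{stimulus}:{i}") for i in range(3)]
    conscious_content = max(percepts)[1]   # winner of the competition
    for m in memories:                     # the broadcast reaches each memory
        m.update(conscious_content)
    return conscious_content               # feeds action selection

for stimulus in ["tone", "light", "tone"]:
    print(cognitive_cycle(stimulus))
```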

Relevance:

10.00%

Publisher:

Abstract:

Teaching is a dynamic activity. It can be very effective if its impact is constantly monitored and adjusted to the demands of changing social contexts and the needs of learners. This implies that teachers need to be aware of teaching and learning processes. Moreover, they should constantly question their didactical methods and the learning resources they provide to their students. They should reflect on whether their actions are suitable, and they should regulate their teaching, e.g., by updating learning materials based on new knowledge about learners, or by motivating learners to engage in further learning activities. In recent years, a rising interest in 'learning analytics' has been observable. This interest is motivated by the availability of massive amounts of educational data; the continuously increasing processing power and a strong motivation for discovering new information in these pools of educational data are also pushing developments within the learning analytics research field. Learning analytics could be a method for reflective teaching practice that enables and guides teachers to investigate and evaluate their work in future learning scenarios. However, this potentially positive impact has not yet been sufficiently verified by learning analytics research. Another method that pursues these goals is 'action research'. Learning analytics promises to initiate action research processes because it facilitates awareness, reflection and regulation of teaching activities analogously to action research. This thesis therefore joins both concepts in order to improve the design of learning analytics tools. The central research questions of this thesis are: What are the dimensions of learning analytics in relation to action research that need to be considered when designing a learning analytics tool? How does a learning analytics dashboard impact the teachers of technology-enhanced university lectures regarding 'awareness', 'reflection' and 'action'? Does it initiate action research? What are the central requirements for a learning analytics tool that pursues such effects? The project followed design-based research principles in order to answer these research questions. The main contributions are: a theoretical reference model that connects action research and learning analytics, the conceptualization and implementation of a learning analytics tool, a requirements catalogue for useful and usable learning analytics design based on evaluations, a tested procedure for impact analysis, and guidelines for the introduction of learning analytics into higher education.

Relevance:

10.00%

Publisher:

Abstract:

Skin segmentation is a challenging task due to several influences such as unknown lighting conditions, skin-colored background, and camera limitations. Many skin segmentation approaches have been proposed in the past, both adaptive (in the sense of updating the skin color online) and non-adaptive. In this paper, we compare three skin segmentation approaches that are promising for hand tracking, which is our main motivation for this work. Hand tracking is widely applicable in VR/AR, e.g., for navigation and object manipulation. The first skin segmentation approach is a well-known non-adaptive approach based on a simple, pre-computed skin color distribution. The second and third approaches adaptively estimate the skin color in each frame using clustering algorithms: the second uses hierarchical clustering for a simultaneous image and color space segmentation, while the third is a pure color space clustering, but with a more sophisticated clustering approach. For evaluation, we compared the segmentation results of the approaches against a ground truth dataset, obtained by labeling about 500 images captured under various conditions.
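
As a concrete baseline, the first (non-adaptive) approach can be sketched as a fixed region in a chrominance plane. The bounds and the YCrCb conversion below are commonly cited textbook values, not the distribution used in the paper, so treat this as an illustrative stand-in only.

```python
import numpy as np

def skin_mask(rgb):
    # Non-adaptive baseline: classify a pixel as skin if it falls inside a
    # fixed, pre-computed region of the CrCb plane (illustrative bounds).
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)

# Toy usage: a 2x2 "image" with two skin-toned pixels on the diagonal.
img = np.array([[[220, 170, 140], [10, 200, 30]],
                [[0, 0, 255], [225, 160, 150]]], dtype=np.uint8)
print(skin_mask(img))   # -> [[ True False] [False  True]]
```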

Relevance:

10.00%

Publisher:

Abstract:

Law collections pose some unique problems in terms of their physical care due to filing and updating practices, use patterns, and special binding structures such as loose-leafs and pocket parts. This workshop is designed to address the specific preservation needs of law collections through lecture, demonstration, and hands-on opportunities. Participants will learn the fundamentals of book repair, treatment options and decision-making, and preservation best practices. Emphasis will be placed on moving knowledge into practice through guidelines for establishing institution-appropriate in-house book repair programs, by training the trainers in basic book repair techniques, and by providing all participants with a start-up tool kit.

Relevance:

10.00%

Publisher:

Abstract:

Even 197 years after its introduction, the notion of polysynthesis remains one of the most intriguing and controversial tools in the (morphological) typologist's toolbox. Several (occasionally contradictory) definitions have been proposed, employed for various purposes, and debated to this very day, but neither practitioners nor theoreticians have yet reached a comfortable level of consensus regarding its most effective and efficient form. The present talk maps the evolution of the notion, discusses its usefulness, and contends that the tool can be made better by updating it minimally with respect to its form and substantially with respect to its conceptual foundations. The former aspect of the update consists of making the notion more precise via explicit qualification; the latter bears relation to our arguably problematic understanding of the pairs lexical vs. grammatical and word vs. clause.