958 results for Courant metric


Relevance:

20.00%

Publisher:

Abstract:

This project of the School of Librarianship, Documentation and Information of the National University supports the UNESCO Memory of the World initiative (UNESCO, 1995) and helps provide universal access to documentation. To this end, the School encourages its students to devote their final graduation projects to the bibliographic control of national documentary production. The project has two objectives: 1. Map the national documentary output through the identification, analysis, organization of and access to the documentary heritage of Costa Rica, as a contribution to the Memory of the World. 2. Perform bibliometric analyses of the documentary records contained in the integrated databases. The project engages undergraduate students nearing graduation in final graduation work on bibliographic control; students may prepare their work on the documentary production of Costa Rica for a specific theme or for a geographical area of the country. The desk work aims to identify each document through access points and to describe its contents so that users can retrieve it. The result is the production of a new document, distinct from the original, a secondary document: the bibliography. The records in the database of each completed bibliographic-control project will be integrated into a single database placed on the EBDI website, for consultation by researchers and interested users.

Relevance:

20.00%

Publisher:

Abstract:

In this contribution, we propose a first general definition of rank-metric convolutional codes for multi-shot network coding. To this end, we introduce a suitable concept of distance and establish a generalized Singleton bound for this class of codes.
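
For context, the classical Singleton-type bound for rank-metric block codes, which the generalized bound for convolutional codes extends, can be stated as follows; the notation below is generic and not taken from the paper.

    % Classical rank-metric Singleton bound for block codes (context only).
    % For a linear code $\mathcal{C} \subseteq \mathbb{F}_{q^m}^{n}$ of dimension $k$,
    % the minimum rank distance satisfies
    \[
      d_R(\mathcal{C}) \;\le\; d_H(\mathcal{C}) \;\le\; n - k + 1,
    \]
    % since the rank weight of a vector never exceeds its Hamming weight.
    % Codes meeting the bound with equality (possible when $m \ge n$) are the
    % maximum rank distance (MRD) codes, e.g. Gabidulin codes.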

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to provide a comprehensive study of some linear non-local diffusion problems in metric measure spaces. These include, for example, open subsets in ℝ^N, graphs, manifolds, multi-structures and some fractal sets. For this, we study regularity, compactness, positivity and the spectrum of the stationary non-local operator. We then study the solutions of linear evolution non-local diffusion problems, with emphasis on similarities and differences with the standard heat equation in smooth domains. In particular, we prove weak and strong maximum principles and describe the asymptotic behaviour using spectral methods.
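
As a concrete illustration, linear non-local diffusion problems of this kind are typically driven by an operator of the following convolution-type form on a metric measure space (X, d, μ); this is a prototypical sketch, not necessarily the exact operator studied in the paper.

    % Prototypical linear non-local diffusion operator on a metric measure
    % space (X, d, \mu), with a nonnegative symmetric kernel J (illustrative).
    \[
      (Lu)(x) = \int_X J(x,y)\,\bigl(u(y) - u(x)\bigr)\,d\mu(y), \qquad x \in X,
    \]
    % and the associated evolution problem
    \[
      u_t(x,t) = (L\,u(\cdot,t))(x), \qquad u(x,0) = u_0(x),
    \]
    % which plays the role that the heat equation $u_t = \Delta u$ plays in
    % smooth domains.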

Relevance:

20.00%

Publisher:

Abstract:

Information entropy measured from acoustic emission (AE) waveforms is shown to be an indicator of fatigue damage in a high-strength aluminum alloy. Several tension-tension fatigue experiments were performed with dogbone samples of aluminum alloy, Al7075-T6, a commonly used material in aerospace structures. Unlike previous studies in which fatigue damage is simply measured based on visible crack growth, this work investigated fatigue damage prior to crack initiation through the use of instantaneous elastic modulus degradation. Three methods of measuring the AE information entropy, regarded as a direct measure of microstructural disorder, are proposed and compared with traditional damage-related AE features. Results show that one of the three entropy measurement methods appears to better assess damage than the traditional AE features, while the other two entropies have unique trends that can differentiate between small and large cracks.
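
As a minimal sketch of how an information (Shannon) entropy value can be extracted from a digitized AE waveform, the snippet below histograms the sampled amplitudes and evaluates the entropy of the resulting distribution; the bin count is arbitrary and the paper's three specific entropy measures are not reproduced here.

    import numpy as np

    def waveform_entropy(waveform, n_bins=128):
        """Shannon entropy (bits) of the amplitude distribution of an AE waveform.

        Generic illustration: histogram the sampled amplitudes, normalize to a
        probability distribution, and evaluate -sum(p * log2(p)).
        """
        counts, _ = np.histogram(waveform, bins=n_bins)
        p = counts / counts.sum()          # empirical probability of each amplitude bin
        p = p[p > 0]                       # drop empty bins (0 * log 0 := 0)
        return float(-np.sum(p * np.log2(p)))

    # Example: entropy of a synthetic burst-like AE hit
    t = np.linspace(0.0, 1.0, 2048)
    hit = np.exp(-5.0 * t) * np.sin(2.0 * np.pi * 150.0 * t)
    print(waveform_entropy(hit))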

Relevance:

20.00%

Publisher:

Abstract:

By retracing the intellectual trajectory of the American historian, moralist and critic Christopher Lasch, this thesis aims to highlight the relevance and subtleties of his political thought. Based on an analysis of his principal texts, we show that, beyond the pessimism and catastrophism generally attributed to him, Lasch offers a fruitful perspective on the singularity of the contemporary era. We argue that his sharp criticisms of society and the individual are made, above all, with the aim of remedying the moral and societal deficiencies said to have been produced by a certain progressive liberal ideal. According to Lasch, the continuous and unlimited deployment of this ideal is at odds with the essentially contingent and conflictual character of the human condition. In parallel, we present the psychological consequences, expressed as a "culture of narcissism", brought about in particular by various components of contemporary society. Through a rereading of the human condition, Lasch advocates an ideological corrective centred on the notions of limits and hope, rooted in the American agrarian populist tradition of the 19th century. We thus show how this look backward is undertaken in order to provoke a political and identity renewal within society. The study concludes with a discussion of the plausibility of the populist ideal, as Lasch understood it, in the 21st century.

Relevance:

20.00%

Publisher:

Abstract:

Increasing resolution in numerical weather prediction models has allowed increasingly realistic forecasts of atmospheric parameters. Because of the growing variability of the predicted fields, traditional verification methods are not always able to describe model skill, since they are based on a grid-point-by-grid-point match between observation and prediction. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the framework of the MesoVICT international project, the initial aim of this work is to compare these new techniques, highlighting their advantages and disadvantages. First, the MesoVICT basic examples, represented by synthetic precipitation fields, are examined. The SAL method, which evaluates errors in terms of structure, amplitude and location of the precipitation fields, is then studied more thoroughly than the other approaches through its implementation on the core cases of the project. The verification procedure concerns precipitation fields over central Europe: forecasts from the 00z COSMO-2 model are compared with VERA (Vienna Enhanced Resolution Analysis). The study of these cases reveals some weaknesses of the methodology; in particular, a correlation between the optimal domain size and the extent of the precipitation systems is highlighted. To improve the ability of SAL, the original domain is subdivided into three subdomains and the method is applied again. Some limits are found in cases in which at least one of the two fields shows no precipitation. The overall results for the subdomains are summarized in scatter plots. To identify systematic errors of the model, the variability of the three parameters is studied for each subdomain.
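
For reference, a small sketch of the amplitude component A of SAL as it is commonly defined in the spatial-verification literature (the structure S and location L components require object identification and are omitted); the function and field names are illustrative, not taken from the thesis.

    import numpy as np

    def sal_amplitude(forecast, analysis):
        """Amplitude component A of SAL over one (sub)domain.

        A is the normalized difference of the domain-averaged precipitation,
        bounded in [-2, 2]; A > 0 means the model overestimates total rainfall.
        Inputs are 2-D precipitation fields on the same grid (illustrative only).
        """
        d_mod = np.mean(forecast)
        d_obs = np.mean(analysis)
        if d_mod + d_obs == 0.0:
            # Degenerate case noted in the text: no precipitation in either field.
            return np.nan
        return (d_mod - d_obs) / (0.5 * (d_mod + d_obs))

    # Example on random fields standing in for a COSMO-2 forecast and a VERA analysis
    rng = np.random.default_rng(0)
    print(sal_amplitude(rng.gamma(2.0, 1.0, (50, 50)), rng.gamma(2.0, 0.9, (50, 50))))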

Relevance:

20.00%

Publisher:

Abstract:

Describes three units of time helpful for understanding and evaluating classificatory structures: long time (versions and states of classification schemes), short time (the act of indexing as repeated ritual or form), and micro-time (where stages of the interpretation process of indexing are separated out and inventoried). Concludes with a short discussion of how time and the impermanence of classification also conjure up an artistic conceptualization of indexing, and briefly uses that to question the seemingly dominant understanding of classification practice as an outcome of scientific management and assembly-line thought.

Relevance:

20.00%

Publisher:

Abstract:

As the universe of knowledge and subjects changes over time, indexing languages, such as classification schemes, accommodate that change by restructuring. Restructuring indexing languages affects indexer and cataloguer work. Subjects may split or lump together. They may disappear only to reappear later. And new subjects may emerge that were assumed to be already present, but not clearly articulated (Miksa, 1998). In this context we have the complex relationship between the indexing language, the text being described, and the already described collection (Tennis, 2007). It is possible to imagine indexers placing a document into an outdated class, because it is the one they have already used for their collection. However, doing this erases the semantics in the present indexing language. Given this range of choice in the context of indexing language change, the question arises: what does this look like in practice? How often does it occur? Further, what does this phenomenon tell us about subjects in indexing languages? Does the practice we observe in reaction to indexing language change provide us with evidence of conceptual models of subjects and subject creation? If it is incomplete, but gets us close, what evidence do we still require?

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a prototype system for tracking people in enclosed indoor environments with a high rate of occlusions. The system uses a stereo camera for acquisition and is capable of disambiguating occlusions using a combination of depth-map analysis, a two-step ellipse-fitting people-detection process, motion models and Kalman filters, and a novel fit metric based on computationally simple object statistics. Testing shows that our fit metric outperforms commonly used position-based metrics and histogram-based metrics, resulting in more accurate tracking of people.
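
The abstract does not spell out the fit metric, so the following is only a hypothetical sketch of a fit score built from computationally simple object statistics (mean depth, pixel area and ellipse axes are assumed features, not the paper's definition).

    import numpy as np

    def fit_score(track_stats, detection_stats, weights=None):
        """Illustrative fit metric from simple object statistics.

        Compares a candidate detection against a track using per-feature
        relative differences of cheap statistics. The feature set and the
        weighting are hypothetical. Lower scores indicate a better fit.
        """
        t = np.asarray(track_stats, dtype=float)
        d = np.asarray(detection_stats, dtype=float)
        w = np.ones_like(t) if weights is None else np.asarray(weights, dtype=float)
        rel_diff = np.abs(t - d) / (np.abs(t) + 1e-9)   # scale-free per-feature error
        return float(np.sum(w * rel_diff) / np.sum(w))

    # track statistics: [mean depth (m), pixel area, ellipse major axis, ellipse minor axis]
    print(fit_score([3.2, 5400, 120, 60], [3.4, 5100, 118, 63]))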

Relevance:

10.00%

Publisher:

Abstract:

Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As increasingly more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collection or on estimations of the size of the collections, and collection descriptions are often represented as term occurrence statistics. An automatic ontology learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and a decrease in synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation, using the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo. In addition, several specialist search engines such as PubMed and the U.S. Department of Agriculture were analysed. In conclusion, this research shows that the ontology-based method mitigates the need for collection size estimation.
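
A minimal sketch of the subject-based collection representation described above: documents sampled from each engine are classified into ontology subjects, each engine is summarized as a subject distribution, and engines are ranked for a query's subjects. The function names, the toy classifier and the sample data are all hypothetical.

    from collections import Counter

    def subject_profile(sampled_docs, classify):
        """Represent an engine's collection as a distribution over subjects.

        `sampled_docs` are documents sampled from the engine via probe queries;
        `classify` maps a document to a subject label using the trained
        ontology's classification rules. Both are assumptions for illustration.
        """
        counts = Counter(classify(doc) for doc in sampled_docs)
        total = sum(counts.values())
        return {subject: n / total for subject, n in counts.items()}

    def select_engine(profiles, query_subjects):
        """Rank engines by how much of their content falls in the query's subjects."""
        scores = {engine: sum(p.get(s, 0.0) for s in query_subjects)
                  for engine, p in profiles.items()}
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical example with a toy classifier and tiny samples
    classify = lambda doc: "medicine" if "gene" in doc else "agriculture"
    profiles = {"EngineA": subject_profile(["gene therapy", "crop gene"], classify),
                "EngineB": subject_profile(["soil survey", "harvest data"], classify)}
    print(select_engine(profiles, ["medicine"]))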

Relevance:

10.00%

Publisher:

Abstract:

Construction is an information-intensive industry in which the accuracy and timeliness of information is paramount. A central communication issue in construction is to provide a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time-critical and are intended to help maintain or improve efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers focus on the use of mobile computing devices in the construction industry and believe that mobile computers have the potential to solve construction problems that reduce overall productivity. However, to date very limited observation has been conducted on the deployment of mobile computers for construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to give a clear picture of the overall costs and benefits of the new technology. A cost-benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices. In principle, a cost-benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and a summary of all costs and benefits to see which is greater. Even though many cost-benefit analysis methodologies are available for general assessment, no specific methodology exists for analysing the costs and benefits of applying mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages, detailed in ten steps. It is provided for the CRC CI 2002-057-C Project: Enabling Team Collaboration with Pervasive and Mobile Computing, and can be seen in detail in Section 3.
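
As a small illustration of the discounting step (translating future costs and benefits into the terms of a given year), a net-present-value sketch follows; the cash-flow figures and discount rate are invented for the example and are not part of the cited methodology.

    def present_value(cash_flows, discount_rate):
        """Discount a stream of yearly net benefits to the terms of a given year.

        `cash_flows[t]` is the net benefit (benefits minus costs, in dollars) in
        year t, with t = 0 the reference year. Illustrative of the discounting
        step in a cost-benefit analysis, not the CRC CI project methodology.
        """
        return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

    # Example: up-front mobile-computing investment followed by yearly net benefits
    net_benefits = [-50_000, 18_000, 22_000, 25_000]    # dollars per year
    print(round(present_value(net_benefits, 0.07), 2))  # positive => benefits exceed costs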

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To assess the repeatability and validity of lens densitometry derived from the Pentacam Scheimpflug imaging system. Setting: Eye Clinic, Queensland University of Technology, Brisbane, Australia. Methods: This prospective cross-sectional study evaluated 1 eye of subjects with or without cataract. Scheimpflug measurements and slitlamp and retroillumination photographs were taken through a dilated pupil. Lenses were graded with the Lens Opacities Classification System III. Intraobserver and interobserver reliability of 3 observers, each performing 3 repeated Scheimpflug lens densitometry measurements, was assessed. Three lens densitometry metrics were evaluated: linear, for which a line was drawn through the visual axis and a mean lens densitometry value given; peak, which is the point at which lens densitometry is greatest on the densitogram; and 3-dimensional (3D), in which a fixed, circular 3.0 mm area of the lens is selected and a mean lens densitometry value given. Bland and Altman analysis of repeatability for multiple measures was applied; results were reported as the repeatability coefficient and relative repeatability (RR). Results: Twenty eyes were evaluated. Repeatability was high. Overall, interobserver repeatability was marginally lower than intraobserver repeatability. The peak was the least reliable metric (RR 37.31%) and 3D the most reliable (RR 5.88%). Intraobserver and interobserver lens densitometry values in the cataract group were slightly less repeatable than in the noncataract group. Conclusion: The intraobserver and interobserver repeatability of Scheimpflug lens densitometry was high in eyes with and without cataract, which supports the use of automated lens density scoring using the Scheimpflug system evaluated in the study.
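
For readers unfamiliar with these reporting conventions, a brief sketch of the Bland-Altman-style quantities involved; the exact definition the paper uses for relative repeatability is an assumption here, not quoted from the study.

    % Bland-Altman-style repeatability for repeated measures (illustrative).
    % With within-subject standard deviation S_w estimated from the repeated
    % Scheimpflug readings, the repeatability coefficient is
    \[
      RC = 1.96\,\sqrt{2}\;S_w \;\approx\; 2.77\,S_w ,
    \]
    % i.e. the difference between two measurements on the same eye is expected
    % to fall below RC for about 95\% of pairs. A relative (percentage)
    % repeatability can then be obtained by normalizing RC by the overall mean
    % densitometry value (assumed form):
    \[
      RR = \frac{RC}{\bar{x}} \times 100\% .
    \]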

Relevance:

10.00%

Publisher:

Abstract:

Recent research has emphasized the importance for international new ventures of specific resources and competencies. The present research, which belongs to that stream, shows in particular the importance of the human capital that entrepreneurs acquire through their past international experience. At the same time, we show that these organizations are supported by a deliberate intention to internationalize from inception. Our empirical research is based on the analysis of a sample of 466 new English and German high-technology firms. We show that this human capital is an asset that facilitates rapid penetration of foreign markets, all the more so when the new venture is accompanied by a deliberate strategic intent to internationalize. Similar conclusions extend to the resources the entrepreneur devotes to the start-up: the greater these resources, the more the internationalization process tends to occur on a large scale; and here too, the influence of these resources is amplified by the strategic intent to internationalize. Within the empirical literature on born-globals (firms that start out in a globalized market), this research provides one of the first empirical studies linking the influence of initial founding conditions to the likelihood of rapid international growth.

Relevance:

10.00%

Publisher:

Abstract:

In this research, we aim to identify factors that significantly affect the clickthrough of Web searchers. Our underlying goal is to determine more efficient methods to optimize the clickthrough rate. We devise a clickthrough metric for measuring customer satisfaction with search engine results using the number of links visited, the number of queries a user submits, and the rank of clicked links. We use a neural network to detect the significant influence of searching characteristics on future user clickthrough. Our results show that high occurrences of query reformulation, lengthy searching duration, longer query length, and higher ranking of prior clicked links correlate positively with future clickthrough. We provide recommendations for leveraging these findings to improve the performance of search engine retrieval and result ranking, along with implications for search engine marketing.
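
The abstract does not give the metric's formula, so the snippet below is only a hypothetical sketch of a per-session clickthrough score combining the three signals named above (rank of clicked links, number of links visited, number of queries submitted); the weights and functional form are invented.

    import numpy as np

    def clickthrough_score(clicked_ranks, n_links_visited, n_queries):
        """Illustrative per-session clickthrough/satisfaction score.

        Combines the rank of clicked links, the number of links visited and the
        number of queries submitted into one value. The functional form and the
        weights are hypothetical, not the paper's metric.
        """
        if not clicked_ranks:
            return 0.0
        rank_quality = np.mean([1.0 / r for r in clicked_ranks])  # higher for top-ranked clicks
        effort = n_queries + 0.5 * n_links_visited                # reformulation/browsing effort
        return float(rank_quality * n_links_visited / effort)

    # Example session: three clicks at ranks 1, 2 and 5 after two queries
    print(clickthrough_score([1, 2, 5], n_links_visited=3, n_queries=2))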