901 results for Interdependent document relevance
Abstract:
We investigate the influence of the driving mechanism on the hysteretic response of systems with athermal dynamics. In the framework of local mean-field theory at finite temperature (but neglecting thermally activated processes), we compare the rate-independent hysteresis loops obtained in the random field Ising model when controlling either the external magnetic field H or the extensive magnetization M. Two distinct behaviors are observed, depending on disorder strength. At large disorder, the H-driven and M-driven protocols yield identical hysteresis loops in the thermodynamic limit. At low disorder, when the H-driven magnetization curve is discontinuous (due to the presence of a macroscopic avalanche), the M-driven loop is reentrant while the induced field exhibits strong intermittent fluctuations and is only weakly self-averaging. The relevance of these results to the experimental observations in ferromagnetic materials, shape memory alloys, and other disordered systems is discussed.
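For orientation, the local mean-field equations behind the two driving protocols can be sketched schematically as follows, assuming a ferromagnetic nearest-neighbour coupling J and quenched random fields h_i (these specific choices are illustrative assumptions, not details taken from the abstract):

    % H-driven protocol: H is the control parameter and each local
    % magnetization obeys the finite-temperature mean-field equation
    m_i \;=\; \tanh\!\Big[\beta\Big(J\sum_{j\in\partial i} m_j \;+\; h_i \;+\; H\Big)\Big],
        \qquad i = 1,\dots,N .
    % M-driven protocol: the extensive magnetization is constrained,
    %     \textstyle\sum_i m_i = M ,
    % and H becomes the induced field, i.e. the conjugate variable that
    % adjusts (and fluctuates) so that the constraint is satisfied.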
Abstract:
We report here on the magnetic properties of Mn- and Co-doped ZnO nanoparticles. We have found that the ferromagnetism of ZnO:Mn can be switched on and off by consecutive low-temperature annealings in O2 and N2, respectively, while the opposite phenomenology was observed for ZnO:Co. These results suggest that different defects (presumably n-type for ZnO:Co and p-type for ZnO:Mn) are required to induce a ferromagnetic coupling in each case. We argue that the ferromagnetism is likely restricted to a very thin, nanometric layer at the grain surface. These findings reveal, and give insight into, the dramatic relevance of surface effects to the occurrence of ferromagnetism in doped ZnO.
Abstract:
Epitaxial and fully strained SrRuO3 thin films have been grown on SrTiO3(100). At the initial stages the growth mode is three-dimensional (3D)-like, leading to a finger-shaped structure that is aligned with the substrate steps and eventually evolves into 2D step-flow growth. We study the impact that the defect structure associated with this unique growth-mode transition has on the electronic properties of the films. Detailed analysis of the transport properties of nanometric films reveals that microstructural disorder promotes a shortening of the carrier mean free path. Remarkably, at low temperatures this results in a reinforcement of quantum corrections to the conductivity, as predicted by recent models of disordered, strongly correlated electronic systems. This finding may provide a simple explanation for the resistivity minima commonly observed in conducting oxides at low temperature. Simultaneously, the ferromagnetic transition, occurring at about 140 K, becomes broader as the film thickness decreases down to the nanometric range. The relevance of these results for the understanding of the electronic properties of disordered electronic systems, and for the technological applications of SrRuO3 and other ferromagnetic and metallic oxides, is stressed.
Abstract:
The transport and magnetotransport properties of metallic and ferromagnetic SrRuO3 (SRO) and metallic and paramagnetic LaNiO3 (LNO) epitaxial thin films have been investigated in fields up to 55 T at temperatures down to 1.8 K. At low temperatures both samples display a well-defined resistivity minimum. We argue that this behavior is due to the increasing relevance of quantum corrections to the conductivity (QCC) as the temperature is lowered, an effect that is particularly relevant in these oxides because of their short mean free path. However, it is not straightforward to discriminate between the contributions of weak localization and of the renormalization of electron-electron interactions to the QCC through the temperature dependence alone. We have taken advantage of the distinct effect of a magnetic field on the two mechanisms to demonstrate that in ferromagnetic SRO the weak-localization contribution is suppressed by the large internal field, leaving only renormalized electron-electron interactions, whereas in the nonmagnetic LNO thin films the weak-localization term is relevant.
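For reference, the standard three-dimensional forms of the two quantum corrections, which underlie the field-based discrimination described above, are, schematically (textbook expressions rather than fits to the data reported here):

    % Weak localization (WL): suppressed by a large internal or applied field,
    \delta\sigma_{\mathrm{WL}}(T) \;\propto\; T^{\,p/2},
        \qquad \tau_{\phi}^{-1} \propto T^{\,p},
    % Electron-electron interactions (EEI): essentially field-insensitive,
    \delta\sigma_{\mathrm{EEI}}(T) \;\propto\; \sqrt{T},
    % so the low-temperature upturn of the resistivity that survives in a
    % strong field is attributed to the EEI term.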
Abstract:
The effect of the local environment on the energetic strain within small (SiO)N rings (with N=2,3) in silica materials is investigated via periodic model systems employing density functional calculations. By comparing the energies of various nonterminated systems containing small rings in strained and relatively unstrained environments with that of alpha quartz, we demonstrate how small-ring strain is affected by the nature of the embedding environment. We compare our findings with numerous previously reported calculations, which often predict significantly different small-ring strain energies, leading to a critical assessment of methods for calculating accurate localized ring energies. The results have relevance for estimates of the strain-induced response (e.g., chemical, photo, and radio) of small silica rings, and for the propensity of such rings to form in bulk glasses, thin films, and nanoclusters.
Abstract:
This work studies the following important aspects of document image processing and develops new methods: (1) segmentation of document images using an adaptive interval-valued neuro-fuzzy method; (2) improvement of the segmentation procedure using a simulated annealing technique; (3) development of optimized compression algorithms using a genetic algorithm and a parallel genetic algorithm; (4) feature extraction from document images; and (5) development of interval-valued fuzzy rules. The work also supports feature extraction and foreground/background identification. The proposed work incorporates evolutionary and hybrid methods for the segmentation and compression of document images. A study of the different neural networks used in image processing and of developments in the area of fuzzy logic is also carried out in this work.
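As an illustration of how simulated annealing can be plugged into a segmentation step, here is a minimal sketch that anneals a global binarization threshold so as to maximize between-class variance; the objective, neighbourhood move and cooling schedule are illustrative assumptions, not the thesis's actual formulation:

    import numpy as np

    def between_class_variance(image, t):
        """Otsu-style objective: variance between foreground and background at threshold t."""
        fg, bg = image[image > t], image[image <= t]
        if fg.size == 0 or bg.size == 0:
            return 0.0
        w_fg, w_bg = fg.size / image.size, bg.size / image.size
        return w_fg * w_bg * (fg.mean() - bg.mean()) ** 2

    def anneal_threshold(image, t0=128, temp=50.0, cooling=0.95, steps=500, seed=0):
        """Simulated annealing over a single grey-level threshold."""
        rng = np.random.default_rng(seed)
        t, best_t = t0, t0
        score, best = (between_class_variance(image, t0),) * 2
        for _ in range(steps):
            cand = int(np.clip(t + rng.integers(-10, 11), 1, 254))   # random neighbour
            cand_score = between_class_variance(image, cand)
            # accept better moves always, worse moves with Boltzmann probability
            if cand_score > score or rng.random() < np.exp((cand_score - score) / max(temp, 1e-9)):
                t, score = cand, cand_score
            if score > best:
                best_t, best = t, score
            temp *= cooling                                           # geometric cooling
        return best_t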
Abstract:
This work proposes a parallel genetic algorithm for compressing scanned document images. A fitness function based on the Hausdorff distance is designed, and it determines the terminating condition. The algorithm also helps to locate the text lines. A higher compression ratio is achieved with less distortion.
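A minimal sketch of what a Hausdorff-distance-based fitness might look like for an original/reconstructed pair of binary document images; the function names and the weighting between compression ratio and distortion are assumptions, not the paper's actual definition:

    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def hausdorff_distortion(original, reconstructed):
        """Symmetric Hausdorff distance between the ink pixels of two binary images."""
        a = np.argwhere(original > 0)       # (row, col) coordinates of foreground pixels
        b = np.argwhere(reconstructed > 0)
        if len(a) == 0 or len(b) == 0:
            return np.inf
        return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

    def fitness(original, reconstructed, compressed_bytes, alpha=1.0):
        """Reward high compression, penalize geometric distortion of the text lines."""
        ratio = original.nbytes / max(compressed_bytes, 1)
        return ratio - alpha * hausdorff_distortion(original, reconstructed)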
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also of Haavelmo, from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace and Gauss, the discipline later witnessed the applied works of Edgeworth and Mitchell. A very significant milestone in its evolution was the work of Tinbergen, Frisch and Haavelmo on multiple regression and correlation analysis, techniques they used to test different economic theories with time series data. Even though some predictions based on econometric methodology may have gone wrong, the sound scientific nature of the discipline cannot be ignored; this is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the interdisciplinary nature of econometrics (a unification of economics, statistics and mathematics), the subject can be taught across all these broad areas, notwithstanding the fact that most often only economics students are offered the subject, as students of other disciplines may not have an adequate economics background to follow it. In fact, econometrics is quite relevant even for technical courses (such as engineering), business management courses (such as the MBA) and professional accountancy courses, and even more so for research students in the various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy and so on. In this way the analytical ability of students can be sharpened, their ability to approach socio-economic problems mathematically can be improved, and they can be enabled to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, also needs to be pointed out: mere learning of econometric methodology or the underlying theories would not have much practical utility for students in their future careers, whether in academia, industry or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. The paper also looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
Abstract:
The rejection of the European Constitution marks an important crystallization point for debate about the European Union (EU) and the integration process. The European Constitution was envisaged as the founding document of a renewed and enlarged European Union, and it was therefore assumed that it would find wide public support. Its rejection was not anticipated. The negative referenda in France and the Netherlands therefore led to a controversial debate about the more fundamental meaning and the consequences of the rejection, both for the immediate state of affairs and for the further integration process. The rejection of the Constitution and the controversy about its correct interpretation therefore present an intriguing puzzle for political analysis. Although the treaty rejection was taken up widely in the field of European Studies, the focus of existing analyses has predominantly been on explaining why the current situation occurred. Underlying these approaches is the premise that by establishing the reasons for the rejection it is possible to derive the ‘true’ meaning of the event for the EU integration process. In my paper I rely on an alternative, discourse-theoretical approach which aims to overcome the positivist perspective dominating the existing analyses. I argue that the meaning of the event ‘treaty rejection’ is not fixed or inherent to it but discursively constructed. The critical assessment of this concrete meaning-production is highly relevant, as the specific meaning attributed to the treaty rejection effectively constrains the scope of supposedly ‘reasonable’ options for action, both in the concrete situation and in the further European integration process more generally. I will argue that the overall framing suggests a fundamentally technocratic approach to governance on the part of the Commission. Political struggle and public deliberation are no longer foreseen, as the concrete solutions to the citizens’ general concerns are designed by supposedly apolitical experts. Through the communicative diffusion and active implementation of this particular model of governance, the Commission shapes the future integration process more substantially than is obvious from its seemingly limited immediate problem-solving orientation of overcoming the ‘constitutional crisis’. As the European Commission is a central actor in the discourse production, my analysis focuses on the specific interpretation of the situation put forward by the Commission. In order to work out the Commission’s particular take on the event, I conducted a frame analysis (following Benford and Snow) on a body of key sources produced in the context of coping with the treaty rejection.
Abstract:
Traditionally, we've focused on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life-cycle costs, then we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for the operators to see what's going on and then to manipulate the system so that it does what it is supposed to. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's modern operating systems, with nearly 50 million source lines of code, are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the Object Abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this is a lesson that can impact tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access-control model and how new layers of abstraction could further enrich this model.
Abstract:
The Bologna Process advocates the adoption in higher education of teaching-learning methodologies that, in contrast to the previous model based on the transmission of knowledge (which, being essentially theoretical, gives the student a passive role in the knowledge-construction process), allow (pro)active, autonomous and practical learning in which the student acquires and develops his or her competences. The personal tutorial guidance sessions are included in the teaching contact hours. This abstract presents a study of the perceptions of first-cycle students at the University of Minho regarding the relevance of the personal tutorial guidance sessions within the teaching-learning process, so as to confirm whether the implementation of the so-called tutorial-type education, as an approach to active, autonomous and practical learning, is perceived by the learners themselves.
Abstract:
Based on Rijt-Plooij and Plooij’s (1992) research on the emergence of regression periods in the first two years of life, the presence of such periods was analyzed in a group of 18 babies (10 boys and 8 girls, aged between 3 weeks and 14 months) from a Catalonian population. The measurements were a questionnaire filled in by the infants’ mothers, a semi-structured weekly tape-recorded interview, and observations in their homes. The procedure and the instruments used in the project follow those proposed by Rijt-Plooij and Plooij. Our results confirm the existence of regression periods in the first year of children’s life. Inter-coder agreement for trained coders was 78.2% and within-coder agreement was 90.1%. In the discussion, the possible meaning and relevance of regression periods for understanding development within a psychobiological and social framework are commented upon.
Abstract:
We investigate whether dimensionality reduction using a latent generative model is beneficial for the task of weakly supervised scene classification. In detail, we are given a set of labeled images of scenes (for example, coast, forest, city, river, etc.), and our objective is to classify a new image into one of these categories. Our approach consists of first discovering latent "topics" using probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature, here applied to a bag-of-visual-words representation of each image, and subsequently training a multiway classifier on the topic distribution vector of each image. We compare this approach to representing each image by a bag-of-visual-words vector directly and training a multiway classifier on these vectors. To this end, we introduce a novel vocabulary using dense color SIFT descriptors and then investigate the classification performance under changes in the size of the visual vocabulary, the number of latent topics learned, and the type of discriminative classifier used (k-nearest neighbor or SVM). We achieve classification performance superior to that of recent publications that have used a bag-of-visual-words representation, in all cases using the authors' own data sets and testing protocols. We also investigate the gain from adding spatial information. We show applications to image retrieval with relevance feedback and to scene classification in videos.
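The pipeline described here (visual-word histograms, latent topics, discriminative classifier) can be sketched as follows; since scikit-learn ships LDA rather than pLSA, LDA stands in for the pLSA step, and the array names, sizes and parameter values are placeholders:

    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.svm import SVC

    # bow[i, w] = count of visual word w in image i (e.g., from dense color SIFT
    # descriptors quantized against a learned vocabulary); y[i] = scene label.
    bow = np.random.randint(0, 5, size=(200, 1500))   # placeholder data
    y = np.random.randint(0, 8, size=200)             # placeholder labels

    # 1. Reduce each image to a distribution over latent topics
    #    (LDA used here as a stand-in for pLSA).
    topics = LatentDirichletAllocation(n_components=25, random_state=0).fit_transform(bow)

    # 2. Train a multiway discriminative classifier on the topic vectors.
    clf = SVC(kernel="rbf").fit(topics, y)

    # 3. A new image is classified from its own topic distribution.
    pred = clf.predict(topics[:1])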
Abstract:
This paper describes a method for extracting the most relevant contours of an image. The method integrates the information of the local contours obtained from the chromatic components H, S and I, taking into account the coherence of the local contour orientation values obtained from each of these components. The process is based on parametrizing, pixel by pixel, the local contours (magnitude and orientation values) from the H, S and I images. This process is carried out individually for each chromatic component. If the dispersion of the obtained orientation values is high, that chromatic component loses relevance. A final processing step integrates the extracted contours of the three chromatic components, generating the so-called integrated-contours image.
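A rough sketch of the per-component contour computation and coherence-weighted integration described above, using the circular variance of the local orientations as the dispersion criterion; the exact dispersion measure and weighting used by the paper are not specified here, so these are illustrative assumptions:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_contours(channel):
        """Gradient magnitude and orientation for one chromatic component (H, S or I)."""
        gy, gx = np.gradient(channel.astype(float))
        return np.hypot(gx, gy), np.arctan2(gy, gx)

    def integrated_contours(h, s, i, win=7):
        """Combine the three contour maps, down-weighting components whose local
        orientations are highly dispersed (low coherence)."""
        combined = np.zeros_like(h, dtype=float)
        for ch in (h, s, i):
            mag, ori = local_contours(ch)
            # circular variance of orientation in a local window: 0 = coherent, 1 = dispersed
            c, s_ = uniform_filter(np.cos(2 * ori), win), uniform_filter(np.sin(2 * ori), win)
            dispersion = 1.0 - np.hypot(c, s_)
            combined += (1.0 - dispersion) * mag     # coherent components keep their weight
        return combined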
Abstract:
The reviewed book aims to be a synthesis and an assessment of the state of geography in the United States. Structured in eight chapters, the first six analyse various aspects of geography, and the last two explain what strategy should be pursued to strengthen the role of geography in academia and in society at large.