962 results for Pattern-matching technique
Abstract:
Developers have an obligation to biodiversity when considering the impact their development may have on the environment, and some choose to go beyond the legal requirement for planning consent. Climate change projections over the 21st century indicate climate warming, so the species selected for habitat creation need to be able to withstand the pressures associated with these forecasts. A process is therefore required to identify resilient plantings for sites subject to climate change. Local government ecologists were consulted on their views on the use of plants of non-native provenance and on how they consider resilience to climate change as part of their planting recommendations. There are mixed attitudes towards non-native species, but with studies already showing the impact climate change is having on biodiversity, action needs to be taken to limit further biodiversity loss, particularly given the heavily fragmented landscape preventing natural migration. A methodology has been developed to provide planners and developers with recommendations for plant species that are already adapted to the climate the UK will experience in the future. A climate-matching technique that employs a GIS allows the identification of European locations whose current climate matches the climate predicted for a given UK location. Once an appropriate location has been selected, the plant species present in that area are investigated for their suitability for planting in the UK. The methodology was trialled at one site, Eastern Quarry in Kent, and suitable climate-matched locations included areas in north-western France. Through the acquisition of plant species data via site visits and online published material, a species list was created which considered the original habitat design but added resilience to climate change.
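The climate-matching step can be illustrated with a minimal sketch: rank candidate European locations by how closely their current climate resembles the projected future climate of a UK site. All climate values, site names, and scaling weights below are invented for illustration; the thesis uses a GIS over gridded climate data rather than a hand-made table.

```python
import math

# Hypothetical climate records: (mean annual temperature in degrees C,
# annual precipitation in mm). Values are illustrative, not real data.
EUROPEAN_SITES = {
    "Nantes, France":   (12.6, 820),
    "Rennes, France":   (12.1, 694),
    "Hamburg, Germany": (9.4, 773),
    "Porto, Portugal":  (15.2, 1254),
}

def climate_match(future_temp, future_precip, sites,
                  temp_scale=1.0, precip_scale=100.0):
    """Rank sites by similarity of their *current* climate to the
    *projected* climate of the target location. Each variable is
    scaled so 1 degree C counts as much as `precip_scale` mm of rain."""
    def distance(t, p):
        return math.hypot((t - future_temp) / temp_scale,
                          (p - future_precip) / precip_scale)
    return sorted(sites, key=lambda name: distance(*sites[name]))

# Projected late-century climate for a hypothetical Kent site.
ranked = climate_match(12.4, 750, EUROPEAN_SITES)
print(ranked[0])  # the closest current analogue
```

With these made-up numbers the closest analogue lands in north-western France, consistent with the Eastern Quarry trial described above.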
Abstract:
Purpose - The purpose of this paper is to assess high-dimensional visualisation, combined with pattern matching, as an approach to observing dynamic changes in the ways people tweet about science topics. Design/methodology/approach - The high-dimensional visualisation approach was applied to three scientific topics to test its effectiveness for longitudinal analysis of message framing on Twitter over two disjoint periods in time. The paper uses coding frames to drive categorisation and visual analytics of tweets discussing the science topics. Findings - The findings point to the potential of this mixed methods approach, as it allows sufficiently high sensitivity to recognise and support the analysis of non-trending as well as trending topics on Twitter. Research limitations/implications - Three topics are studied and these illustrate a range of frames, but results may not be representative of all scientific topics. Social implications - Funding bodies increasingly encourage scientists to participate in public engagement. As social media provides an avenue actively utilised for public communication, understanding the nature of the dialogue on this medium is important for the scientific community and the public at large. Originality/value - This study differs from standard approaches to the analysis of microblog data, which tend to focus on machine-driven analysis of large-scale datasets. It provides evidence that this approach enables practical and effective analysis of the content of mid-size to large collections of microposts.
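The coding-frame categorisation step can be sketched as simple pattern matching over tweet text. The frame names and keyword patterns below are invented for the sketch and are not the paper's actual coding frames:

```python
import re

# Illustrative coding frame: each frame is defined by keyword patterns.
FRAMES = {
    "scientific_evidence": re.compile(r"\b(study|data|evidence|research)\b", re.I),
    "policy":              re.compile(r"\b(government|policy|regulation|funding)\b", re.I),
    "humour":              re.compile(r"\b(lol|joke|funny)\b", re.I),
}

def code_tweet(text):
    """Return the set of frames whose pattern matches the tweet."""
    return {name for name, pattern in FRAMES.items() if pattern.search(text)}

tweets = [
    "New study shows vaccine efficacy above 90% - the data is solid",
    "Government funding for basic research keeps shrinking, sadly",
]
print([sorted(code_tweet(t)) for t in tweets])
```

In the paper the coded tweets then feed a high-dimensional visualisation; here only the categorisation half is shown.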
Abstract:
To promote regional or mutual improvement, numerous interjurisdictional efforts to share tax bases have been attempted. Most of these efforts fail to be consummated. Motivations to share revenues include: narrowing fiscal disparities, enhancing regional cooperation and economic development, rationalizing land-use, and minimizing revenue losses caused by competition to attract and keep businesses. Various researchers have developed theories to aid understanding of why interjurisdictional cooperation efforts succeed or fail. Walter Rosenbaum and Gladys Kammerer studied two contemporaneous Florida local-government consolidation attempts. Boyd Messinger subsequently tested their Theory of Successful Consolidation on nine consolidation attempts. Paul Peterson's dual theories on Modern Federalism posit that all governmental levels attempt to further economic development and that politicians act in ways that either further their futures or cement job security. Actions related to the latter theory often interfere with the former. Samuel Nunn and Mark Rosentraub sought to learn how interjurisdictional cooperation evolves. Through multiple case studies they developed a model framing interjurisdictional cooperation in four dimensions. This dissertation investigates the ability of the above theories to help predict success or failure of regional tax-base revenue sharing attempts. A research plan was formed that used five sequenced steps to gather data, analyze it, and conclude whether hypotheses concerning the application of these theories were valid. The primary analytical tools were: multiple case studies, cross-case analysis, and pattern matching. Data were gathered from historical records, questionnaires, and interviews. The results of this research indicate that Rosenbaum-Kammerer theory can be a predictor of success or failure in implementing tax-base revenue sharing if it is amended as suggested by Messinger and further modified by a recommendation in this dissertation.
Peterson's Functional and Legislative theories considered together were able to predict revenue sharing proposal outcomes. Many of the indicators of interjurisdictional cooperation forwarded in the Nunn-Rosentraub model appeared in the cases studied, but the model was not a reliable forecasting instrument.
Abstract:
The primary purpose of this thesis was to present a theoretical large-signal analysis to study the power gain and efficiency of a microwave power amplifier for LS-band communications using software simulation. Power gain, efficiency, reliability, and stability are important characteristics in the power amplifier design process. These characteristics affect advanced wireless systems, which require low-cost device amplification without sacrificing system performance. Large-signal modeling and input and output matching components are used in this thesis. Motorola's Electro Thermal LDMOS model is a new transistor model that includes self-heating effects and is capable of small- and large-signal simulations. It allows most of the design considerations to focus on stability, power gain, bandwidth, and DC requirements. The matching technique allows the gain to be maximized at a specific target frequency. Calculations and simulations for the microwave power amplifier design were performed using Matlab and Microwave Office, respectively. Microwave Office is the simulation software used in this thesis. The study demonstrated that Motorola's Electro Thermal LDMOS transistor in the microwave power amplifier design process is a viable solution for common-source amplifier applications in high power base stations. The MET-LDMOS met the stability requirements for the specified frequency range without a stability-improvement model. The power gain of the amplifier circuit was improved through proper microwave matching design using input/output matching techniques. The gain and efficiency of the amplifier improved by approximately 4 dB and 7.27%, respectively. The gain value is roughly 0.89 dB higher than the maximum gain specified by the MRF21010 data sheet. This work can lead to efficient modeling and development of high-power LDMOS transistor implementations in commercial and industrial applications.
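As a hedged illustration of input/output matching, the following sketches a textbook lossless L-section that matches a low transistor output resistance to a 50-ohm system. The topology, frequency, and resistances below are illustrative assumptions and are not the thesis's actual MET-LDMOS matching network:

```python
import math

def l_section_match(r_source, r_load, freq_hz):
    """Lossless L-section matching a purely resistive load to a higher
    source resistance: series inductor at the load, shunt capacitor at
    the source. Standard low-pass textbook topology."""
    if r_source <= r_load:
        raise ValueError("this topology assumes r_source > r_load")
    q = math.sqrt(r_source / r_load - 1.0)   # loaded Q of the section
    x_series = q * r_load                    # required series reactance
    x_shunt = r_source / q                   # required shunt reactance
    w = 2.0 * math.pi * freq_hz
    return {"L_henry": x_series / w, "C_farad": 1.0 / (w * x_shunt), "Q": q}

# Match a hypothetical 5-ohm device output to a 50-ohm system at 2.1 GHz
# (values chosen for illustration, not taken from the MRF21010 data sheet).
net = l_section_match(50.0, 5.0, 2.1e9)
print(net["Q"])  # 3.0
```

A single L-section fixes its own Q (here sqrt(50/5 - 1) = 3), which is why practical designs often cascade sections to trade bandwidth against component values.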
Abstract:
This work presents an analysis of the behavior of several algorithms commonly found in the stereo correspondence literature, applied to full-HD images (1920x1080 pixels), in order to establish, within the precision-versus-runtime trade-off, the applications for which each method is best suited. The images are obtained by a system composed of a stereo camera coupled to a computer via a capture board. The OpenCV library is used for the computer vision operations and image processing involved. The algorithms discussed are a general block-matching search method using the Sum of Absolute Differences (SAD), a global technique based on energy minimisation via graph cuts, and a so-called semi-global matching technique. The criteria for analysis are processing time, heap memory consumption, and the mean absolute error of the disparity maps generated.
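The SAD block-matching idea can be sketched on a single synthetic scanline. The thesis runs OpenCV implementations on full-HD stereo pairs; this toy version only shows the cost computation itself:

```python
def sad_disparity(left, right, block=3, max_disp=4):
    """Brute-force SAD block matching on one scanline (grayscale lists).
    For each block in the left row, find the horizontal shift into the
    right row that minimises the Sum of Absolute Differences."""
    half = block // 2
    disparities = []
    for x in range(half, len(left) - half):
        patch = left[x - half : x + half + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(0, max_disp + 1):   # candidate shift into the right image
            if x - half - d < 0:
                break
            cand = right[x - half - d : x + half + 1 - d]
            cost = sum(abs(a - b) for a, b in zip(patch, cand))
            if cost < best_cost:
                best_cost, best_d = cost, d
        disparities.append(best_d)
    return disparities

# Synthetic scanline: the right image is the left one shifted by 2 pixels.
left  = [0, 0, 10, 80, 200, 80, 10, 0, 0, 0]
right = [10, 80, 200, 80, 10, 0, 0, 0, 0, 0]
print(sad_disparity(left, right))  # textured interior recovers the shift of 2
```

The edges of the row show the usual block-matching boundary artifacts; real implementations such as OpenCV's StereoBM add texture and uniqueness checks on top of this core cost.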
Abstract:
After a productivity decrease in established national export industries in Finland, such as the mobile and paper industries, innovative, smaller companies with the intention to internationalize right from the start have been proliferating. For software companies early internationalization is an especially good opportunity, as Internet usage becomes increasingly homogeneous across borders and software products often do not need a physical distribution channel. Globalization also makes Finnish companies turn to unfamiliar export markets like Latin America, a very untraditional market for Finns. Relationships consisting of Finnish and Latin American business partners have therefore not been widely studied, especially from a new-age software company’s perspective. To study these partnerships, relationship marketing theory was taken into the core of the study, as its practice focuses mainly on establishing and maintaining relationships with stakeholders at a profit, so that the objectives of all parties are met, which is done by a mutual exchange and fulfillment of promises. The most important dimensions of relationship marketing were identified as trust, commitment and attraction, which were then focused on, as the study aims to understand the implications Latin American business culture has for the understanding, and hence effective application, of relationship marketing in the Latin American market. The question to be answered, consequently, was: how should the dimensions of trust, commitment and attraction be understood in business relationships in Latin America? The study was conducted by first joining insights given by Latin American business culture literature with overall theories on the three dimensions. Through pattern matching, these insights were compared to empirical evidence collected from business professionals of the Latin American market and from the experiences of Finnish software businesses that had recently expanded into the market.
What was found was that previous literature on Latin American business culture had already named many implications for the relationship marketing dimensions that were relevant also for small Finnish software firms on the market. However, key findings also presented important new drivers for the three constructs. Local presence in the area where the Latin American partner is located was found to drive or enhance trust, commitment and attraction. High-frequency follow-up procedures were in turn found to drive commitment and attraction. Both local presence and follow-up were defined according to the respective evidence in the study. Also, in the context of Finnish software firms in relationships with Latin American partners, the national origin or foreignness of the Finnish party was seen to enhance trust and attraction in the relationship.
Abstract:
As collections of archived digital documents continue to grow, the maintenance of an archive, and the quality of reproduction from the archived format, become important long-term considerations. In particular, Adobe's PDF is now an important final-form standard for archiving and distributing electronic versions of technical documents. It is important that all embedded images in a PDF, and any fonts used for text rendering, should at the very minimum be easily readable on screen. Unfortunately, because PDF is based on PostScript technology, it allows the embedding of bitmap fonts in Adobe Type 3 format as well as higher-quality outline fonts in TrueType or Adobe Type 1 formats. Bitmap fonts do not generally perform well when they are scaled and rendered on low-resolution devices such as workstation screens. The work described here investigates how a plug-in to Adobe Acrobat enables bitmap fonts to be substituted by corresponding outline fonts, using a checksum matching technique against a canonical set of bitmap fonts as originally distributed. The target documents for our initial investigations are those PDF files produced by (La)TeX systems when set up in a default (bitmap font) configuration. For all bitmap fonts where recognition exceeds a certain confidence threshold, replacement fonts in Adobe Type 1 (outline) format can be substituted, with consequent improvements in file size, screen display quality and rendering speed. The accuracy of font recognition is discussed, together with the prospects of extending these methods to bitmap-font PDF files from sources other than (La)TeX.
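The checksum-matching idea can be sketched as a lookup keyed by a hash of the glyph's raster data. The 3x3 "glyph bitmaps" and outline names below are toys standing in for real Type 3 rasters and Type 1 replacements:

```python
import hashlib

def checksum(bitmap_rows):
    """Checksum of a glyph's raster data (rows of '0'/'1' strings)."""
    return hashlib.sha256("".join(bitmap_rows).encode()).hexdigest()

# Canonical set of bitmap glyphs, keyed by checksum, mapping to the
# outline glyph that should replace them. Names are invented.
CANONICAL = {
    checksum(["010", "111", "101"]): "CMR10-A",
    checksum(["111", "100", "111"]): "CMR10-C",
}

def substitute(embedded_bitmap_rows):
    """Return the outline glyph to substitute, or None if the embedded
    bitmap does not match any canonical checksum."""
    return CANONICAL.get(checksum(embedded_bitmap_rows))

print(substitute(["010", "111", "101"]))  # known glyph, substituted
print(substitute(["000", "000", "000"]))  # unknown glyph -> None
```

The real plug-in works per font and per size against the fonts as originally distributed, and only substitutes when recognition confidence is high enough; this sketch shows only the exact-match core.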
Abstract:
Document representations can rapidly become unwieldy if they try to encapsulate all possible document properties, ranging from abstract structure to detailed rendering and layout. We present a composite document approach wherein an XML-based document representation is linked via a shadow tree of bi-directional pointers to a PDF representation of the same document. Using a two-window viewer, any material selected in the PDF can be related back to the corresponding material in the XML, and vice versa. In this way the treatment of specialist material such as mathematics, music or chemistry (e.g. via 'read aloud' or 'play aloud') can be activated via standard tools working within the XML representation, rather than requiring that application-specific structures be embedded in the PDF itself. The problems of textual recognition and tree pattern matching between the two representations are discussed in detail. Comparisons are drawn between our use of a shadow tree of pointers to map between document representations and the use of a code-replacement shadow tree in technologies such as XBL.
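The textual-recognition problem between the two representations can be hinted at with a toy example using invented text runs. Note how a hyphenated PDF line break defeats naive string similarity, which is one reason the paper's structured tree matching (rather than flat string comparison) is needed:

```python
from difflib import SequenceMatcher

# What a PDF extractor might yield (including a hyphenated line break)
# versus the logical text runs held in the XML tree. Runs are invented.
pdf_runs = ["Chap-", "ter 1", "E = mc2", "fox jumps over the lazy dog"]
xml_runs = ["Chapter 1", "fox jumps over the lazy dog"]

def best_match(xml_run, candidates):
    """Pick the PDF run most similar to an XML run, a crude stand-in
    for the paper's shadow-tree pointer construction."""
    return max(candidates, key=lambda r: SequenceMatcher(None, xml_run, r).ratio())

for run in xml_runs:
    print(repr(run), "->", repr(best_match(run, pdf_runs)))
```

The intact run matches exactly, but "Chapter 1" can only be paired with one of its two PDF fragments, so the bi-directional pointers must map one XML node to several PDF runs.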
Abstract:
This thesis reports on the development of quantitative measurement using micromachined scanning thermal microscopy (SThM) probes. These thermal probes employ a resistive element at their end, which can be used in passive or active modes. With the help of a review of SThM, the current issues and potentials associated with this technique are revealed. As a consequence of this understanding, several experimental and theoretical methods are discussed, which expand our understanding of these probes. The whole thesis can be summarized in three parts, one focusing on the thermal probe, one on probe-sample thermal interactions, and the third on heat transfer within the sample. In the first part, a series of experiments are demonstrated, aimed at characterizing the electrical and thermal properties of the probe, benefiting advanced probe design, and laying a fundamental base for quantifying the temperature of the probe. The second part focuses on two artifacts observed during the thermal scans – one induced by topography and the other by air conduction. Correspondingly, two devices probing these artifacts are developed. A topography-free sample, utilizing a pattern transfer technique, minimises topography-related artifacts that limited the reliability of SThM data; a controlled temperature ‘Johnson noise device’, with a multiple-heater design, offers a uniform, accurate temperature distribution. Analyzing the results of scans of these samples provides data for studying the thermal interactions within the probe and the tip-sample interface. In the final part, the observation is presented that quantification of measurements depends not only on an accurate measurement tool, but also on a deep understanding of the heat transfer within the sample resulting from the nanoscopic contact. It is believed that the work in this thesis contributes to SThM gaining wider application in the scientific community.
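The principle behind a Johnson noise temperature reference is the standard Johnson-Nyquist relation: a resistor R at temperature T produces a mean-square noise voltage of 4*k_B*T*R*B over bandwidth B, so T can be recovered from a calibrated noise measurement. The numbers below are purely illustrative and say nothing about the thesis's multiple-heater device design:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_from_noise(v_rms, resistance, bandwidth):
    """Invert the Johnson-Nyquist relation V_rms^2 = 4*k_B*T*R*B."""
    return v_rms ** 2 / (4.0 * K_B * resistance * bandwidth)

# Illustrative numbers: a 10 kOhm sensor over a 100 kHz bandwidth.
v = (4.0 * K_B * 300.0 * 10e3 * 100e3) ** 0.5  # noise of a 300 K resistor
print(round(temperature_from_noise(v, 10e3, 100e3)))  # 300
```

The appeal for SThM calibration is that the relation ties temperature directly to fundamental constants, independent of any material-specific sensor calibration.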
Abstract:
During the last 15 years, the public school system in Bogotá, Colombia has maintained a concession system in which 25 schools are managed privately with exemptions to many of the rules required in the traditional schools. This study uses the propensity score matching technique to examine whether students in the privately-managed schools have better scores on the Saber 11° examinations taken upon completion of secondary school. The results for 251 schools indicate that students with comparable socioeconomic characteristics score considerably better on these tests in the privately-managed schools than in the traditional public schools. Thus, there is evidence that the privately-managed public schools are a cost-effective alternative to the traditional public school.
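Propensity score matching can be sketched minimally as nearest-neighbour matching on pre-computed scores. All student records and scores below are invented; in the actual study the propensity scores would come from a model of socioeconomic covariates:

```python
# Minimal nearest-neighbour matching on pre-computed propensity scores.
students = [
    # (id, treated = concession school, propensity score, exam score)
    ("t1", True,  0.72, 61),
    ("t2", True,  0.35, 55),
    ("c1", False, 0.70, 54),
    ("c2", False, 0.40, 52),
    ("c3", False, 0.10, 47),
]

def match_and_att(units, caliper=0.1):
    """Match each treated unit to the nearest control by propensity
    score (within a caliper) and return the average treatment effect
    on the treated (ATT) as the mean outcome difference."""
    controls = [u for u in units if not u[1]]
    diffs = []
    for uid, treated, ps, y in units:
        if not treated:
            continue
        cid, _, cps, cy = min(controls, key=lambda c: abs(c[2] - ps))
        if abs(cps - ps) <= caliper:
            diffs.append(y - cy)
    return sum(diffs) / len(diffs)

print(match_and_att(students))  # (61-54 + 55-52) / 2 = 5.0
```

The caliper drops treated units with no sufficiently similar control, which is what lets the comparison claim "students with comparable socioeconomic characteristics".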
Abstract:
A visibility/invisibility paradox of trust operates in the development of distributed educational leadership for online communities. If trust is to be established, the team-based informal ethos of online collaborative networked communities requires a different kind of leadership from that observed in more formal face-to-face positional hierarchies. Such leadership is more flexible and sophisticated, being capable of encompassing both ambiguity and agile response to change. Online educational leaders need to be partially invisible, delegating discretionary powers, to facilitate the effective distribution of leadership tasks in a highly trusting team-based culture. Yet, simultaneously, online communities are facilitated by the visibility and subtle control effected by expert leaders. This paradox, that leaders need to be both highly visible and invisible when appropriate, was derived during research on 'Trust and Leadership' and tested in the analysis of online community case study discussions using a pattern-matching process to measure conversational interactions. This paper argues that both leader visibility and invisibility are important for effective trusting collaboration in online distributed leadership. Advanced leadership responses to complex situations in online communities foster positive group interaction, mutual trust and effective decision-making, facilitated through the active distribution of tasks.
Abstract:
Most face recognition approaches require prior training in which a given distribution of faces is assumed in order to predict the identity of test faces. Such an approach may experience difficulty in identifying faces belonging to distributions different from the one provided during training. A face recognition technique that performs well regardless of training is, therefore, interesting to consider as a basis for more sophisticated methods. In this work, the Census Transform is applied to describe the faces. Based on a scanning window which extracts local histograms of Census features, we present a method that directly matches face samples. With this simple technique, 97.2% of the faces in the FERET fa/fb test were correctly recognized. Despite this being an easy test set, we have found no other approaches in the literature regarding direct comparisons of faces with such performance. Also, a window for further improvement is presented. Among other techniques, we demonstrate how the use of SVMs over the Census Histogram representation can increase recognition performance.
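The Census Transform itself can be sketched in a few lines; the paper goes further, extracting local histograms of these codes in a scanning window and matching face samples directly:

```python
def census_transform(img):
    """3x3 census transform: each interior pixel becomes an 8-bit code,
    one bit per neighbour, set when the neighbour is >= the centre."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            code = 0
            for dy, dx in offsets:
                code = (code << 1) | (img[y + dy][x + dx] >= img[y][x])
            out[y][x] = code
    return out

img = [
    [10, 20, 10],
    [20, 50, 20],
    [10, 20, 10],
]
print(census_transform(img)[1][1])  # centre is the local maximum -> code 0
```

Because the code depends only on ordering, not on absolute intensities, the descriptor is invariant to monotonic illumination changes, which is part of its appeal for training-free matching.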
Abstract:
The majority of biometric researchers focus on the accuracy of matching using biometric databases, including iris databases, while scalability and speed issues have been neglected. In applications such as identification at airports and borders, it is critical for the identification system to have a low response time. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a pre-developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition, and its performance on iris databases of various scales. The paper investigates several classifiers that are widely used in iris recognition work, comparing response time along with accuracy. The existing Gauss-Laguerre wavelet based iris coding scheme, which shows perfect discrimination with a rotary Hamming distance classifier, is used for iris coding. The performance of the classifiers is compared using small, medium, and large scale databases. The comparison shows that OPF has a faster response for the large scale database, thus performing better than the more accurate but slower Bayesian classifier.
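The rotary Hamming distance used with binary iris codes can be sketched as follows, assuming toy codes: the distance is minimised over a small range of circular bit shifts to compensate for eye rotation between captures.

```python
def rotary_hamming(code_a, code_b, max_shift=3):
    """Minimum normalised Hamming distance between two binary iris
    codes over circular bit shifts of code_b."""
    n = len(code_a)
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        rotated = code_b[s % n:] + code_b[:s % n]
        d = sum(a != b for a, b in zip(code_a, rotated)) / n
        best = min(best, d)
    return best

a = [1, 0, 1, 1, 0, 0, 1, 0]
b = a[2:] + a[:2]            # the same code, rotated by two bits
print(rotary_hamming(a, b))  # 0.0
```

Real iris codes run to thousands of bits and the comparison is a bitwise XOR plus popcount, but the shift-and-minimise structure is the same.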
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)