912 results for simplicity
Abstract:
This work was supported by the Bulgarian National Science Fund under grant BY-TH-105/2005.
Abstract:
In 2000 A. Alesina and M. Galuzzi presented Vincent’s theorem “from a modern point of view” along with two new bisection methods derived from it, B and C. Their profound understanding of Vincent’s theorem accounts for the simplicity that characterizes these two methods. In this paper we compare the performance of these two new bisection methods, i.e. the time they take and the number of intervals they examine in order to isolate the real roots of polynomials, against that of the well-known Vincent-Collins-Akritas method, the first bisection method derived from Vincent’s theorem, back in 1976. Experimental results indicate that REL, the fastest implementation of the Vincent-Collins-Akritas method, is still the fastest of the three bisection methods, but the number of intervals it examines is almost the same as that of B. Therefore, further research on speeding up B while preserving its simplicity looks promising.
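To make the underlying idea concrete, the following is a minimal sketch in Python (with SymPy) of a Descartes-rule-based bisection scheme in the spirit of the Vincent-Collins-Akritas method. It is not the authors' implementations of B, C or REL; the example polynomial and interval are illustrative only.

```python
# Bisection-based real-root isolation sketch: bound the number of roots of a
# squarefree polynomial in an open interval via a Moebius substitution and
# Descartes' rule of signs, and bisect until each interval holds one root.
import sympy as sp

x = sp.Symbol('x')

def sign_variations(coeffs):
    """Count sign changes in a coefficient sequence, ignoring zeros."""
    signs = [sp.sign(c) for c in coeffs if c != 0]
    return sum(1 for s1, s2 in zip(signs, signs[1:]) if s1 != s2)

def root_count_bound(p, a, b):
    """Descartes bound on the number of roots of p in (a, b): substitute
    x -> (a + b*t)/(1 + t), clear denominators, count sign variations."""
    n = sp.degree(p, x)
    t = sp.Symbol('t')
    q = sp.expand((1 + t)**n * p.subs(x, (a + b*t) / (1 + t)))
    return sign_variations(sp.Poly(q, t).all_coeffs())

def isolate(p, a, b, intervals=None):
    """Recursively bisect (a, b) until each subinterval contains at most
    one root of the squarefree polynomial p."""
    if intervals is None:
        intervals = []
    v = root_count_bound(p, a, b)
    if v == 0:
        return intervals                 # no root in this interval
    if v == 1:
        intervals.append((a, b))         # exactly one root isolated
        return intervals
    m = sp.Rational(1, 2) * (a + b)
    if p.subs(x, m) == 0:                # root exactly at the midpoint
        intervals.append((m, m))
    isolate(p, a, m, intervals)
    isolate(p, m, b, intervals)
    return intervals

if __name__ == "__main__":
    poly = x**3 - 2*x**2 - x + 2         # roots at -1, 1 and 2
    print(isolate(poly, 0, 4))           # isolates the positive roots
```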
Abstract:
As shown recently, a long telecommunication fibre may be treated as a natural one-dimensional random system, where lasing is possible due to a combination of random distributed feedback via Rayleigh scattering by natural refractive index inhomogeneities and distributed amplification through the Raman effect. Here we present a new type of random fibre laser with a narrow (∼1 nm) spectrum tunable over a broad wavelength range (1535-1570 nm), with uniquely flat (∼0.1 dB) and high (>2 W) output power and prominent (>40 %) differential efficiency, which outperforms traditional fibre lasers of the same category, e.g. a conventional Raman laser with a linear cavity formed in the same fibre by adding point reflectors. An analytical model is proposed that quantitatively explains the higher efficiency and the flatter tuning curve of the random fibre laser compared to the conventional one. Other important features of the random fibre laser, such as its "modeless" spectrum of specific shape and the corresponding intensity fluctuations, as well as techniques for controlling its output characteristics, are discussed. The outstanding characteristics defined by the new underlying physics, together with the simplicity of the scheme implemented in standard telecom fibre, make the demonstrated tunable random fibre laser a very attractive light source both for fundamental science and for practical applications such as optical communication, sensing and secure transmission. © 2012 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE).
Abstract:
We demonstrate that an interplay between diffraction and defocusing nonlinearity can support stable self-similar plasmonic waves with a parabolic profile. The simplicity of the parabolic shape, combined with the corresponding parabolic spatial phase distribution, creates opportunities for the controllable manipulation of plasmons through the combined action of diffraction and nonlinearity. © 2013 Optical Society of America.
Abstract:
Lexicon-based approaches to Twitter sentiment analysis are gaining much popularity due to their simplicity, domain independence, and relatively good performance. These approaches rely on sentiment lexicons, in which a collection of words is marked with fixed sentiment polarities. However, a word's sentiment orientation (positive, neutral, negative) and/or sentiment strength may change depending on the context and the targeted entities. In this paper we present SentiCircle, a novel lexicon-based approach that takes into account the contextual and conceptual semantics of words when calculating their sentiment orientation and strength in Twitter. We evaluate our approach on three Twitter datasets using three different sentiment lexicons. Results show that our approach significantly outperforms two lexicon baselines. Results are competitive but inconclusive when comparing to the state-of-the-art SentiStrength, and vary from one dataset to another. SentiCircle outperforms SentiStrength in accuracy on average, but falls marginally behind in F-measure. © 2014 Springer International Publishing.
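For reference, below is a minimal sketch of the fixed-polarity lexicon baseline that such approaches start from. SentiCircle itself additionally adjusts orientation and strength from context, which this sketch deliberately omits; the toy lexicon, tokenizer and threshold are illustrative assumptions, not taken from the paper.

```python
# Fixed-polarity lexicon scoring: sum the polarities of known words and map
# the total to a sentiment label. Context-dependent polarity shifts (the
# problem SentiCircle addresses) are intentionally ignored here.
from typing import Dict, List

def tokenize(tweet: str) -> List[str]:
    """Very crude tokenizer: lowercase and split on whitespace."""
    return tweet.lower().split()

def lexicon_sentiment(tweet: str, lexicon: Dict[str, float],
                      threshold: float = 0.0) -> str:
    """Return 'positive', 'negative' or 'neutral' from summed word polarities."""
    score = sum(lexicon.get(tok, 0.0) for tok in tokenize(tweet))
    if score > threshold:
        return "positive"
    if score < -threshold:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    toy_lexicon = {"good": 1.0, "great": 2.0, "bad": -1.0, "awful": -2.0}
    print(lexicon_sentiment("great phone but awful battery", toy_lexicon))
```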
Abstract:
ACM Computing Classification System (1998): E.4, C.2.1.
Abstract:
Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough idea of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set will depend on which category it falls in within the bigness taxonomy. Large p small n data sets for instance require a different set of tools from the large n small p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication, Sequentialization. Indeed, it is important to emphasize right away that the so-called no free lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress the fact that simplicity in the sense of Ockham’s razor non-plurality principle of parsimony tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
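As an illustration of one of the tools listed above, the sketch below applies standardization followed by L1 penalization (the lasso) to a synthetic "large p, small n" problem. The data, penalty strength and use of scikit-learn are assumptions made for illustration; they are not the benchmark setup used in the paper.

```python
# Regularization / penalization on a large-p, small-n regression problem:
# standardize the predictors, then fit a lasso, which shrinks most
# coefficients to exactly zero and performs variable selection.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 50, 2000                              # far more predictors than samples
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, -1.0, 0.5]       # only 5 truly relevant predictors
y = X @ beta + rng.normal(scale=0.5, size=n)

X_std = StandardScaler().fit_transform(X)    # standardization step
model = Lasso(alpha=0.1).fit(X_std, y)       # L1 penalization / selection

print("non-zero coefficients:", int(np.sum(model.coef_ != 0)))
```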
Abstract:
Random fiber lasers blend together attractive features of traditional random lasers, such as low cost and simplicity of fabrication, with high-performance characteristics of conventional fiber lasers, such as good directionality and high efficiency. Low coherence of random lasers is important for speckle-free imaging applications. The random fiber laser with distributed feedback proposed in 2010 led to a quickly developing class of light sources that utilize inherent optical fiber disorder in the form of the Rayleigh scattering and distributed Raman gain. The random fiber laser is an interesting and practically important example of a photonic device based on exploitation of optical medium disorder. We provide an overview of recent advances in this field, including high-power and high-efficiency generation, spectral and statistical properties of random fiber lasers, nonlinear kinetic theory of such systems, and emerging applications in telecommunications and distributed sensing.
Abstract:
Color information is widely used in non-destructive quality assessment of perishable horticultural produce. The presented work investigated color changes of pepper (Capsicum annuum L.) samples obtained from the retail system. The effect of storage temperature (10±2°C and 24±4°C) on surface color and firmness was analyzed. Hue spectra were calculated using the sum of saturations. A ColorLite sph850 (400-700nm) spectrophotometer was used as the reference instrument. Dynamic firmness was measured at three locations on the surface: tip cap, middle and shoulder. Significant effects of storage conditions and surface location on both color and firmness were observed. The hue spectra responded sensitively to the color development of pepper. A prediction model (PLS) was used to estimate dynamic firmness from the hue spectra. Accuracy varied considerably depending on the location: firmness of the tip cap was predicted with the highest accuracy (RMSEP=0.0335), while the middle region cannot be used for this purpose. Due to its simplicity and rapid processing, analysis of hue spectra is a promising tool for the evaluation of color in the postharvest and food industries.
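A minimal sketch of the modelling step described above is given below: a PLS regression that maps a hue spectrum to a firmness value and is scored by RMSEP. The synthetic spectra, wavelength grid and number of latent variables are assumptions for illustration; they are not the study's data or settings.

```python
# Partial least squares (PLS) regression from spectra to a scalar response,
# evaluated with the root-mean-square error of prediction (RMSEP).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_bands = 120, 151                # e.g. 400-700 nm in 2 nm steps
hue_spectra = rng.random((n_samples, n_bands))
# Synthetic "firmness" correlated with part of the spectrum, plus noise.
firmness = hue_spectra[:, 40:60].mean(axis=1) + rng.normal(0, 0.01, n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    hue_spectra, firmness, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

rmsep = np.sqrt(mean_squared_error(y_test, y_pred))
print(f"RMSEP = {rmsep:.4f}")
```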
Abstract:
With the latest developments in computer science, multivariate data analysis methods have become increasingly popular among economists. Pattern recognition in complex economic data and empirical model construction can be more straightforward with the proper application of modern software. However, despite the appealing simplicity of some popular software packages, the interpretation of data analysis results requires strong theoretical knowledge. This book aims at combining the development of both theoretical and application-related data analysis knowledge. The text is designed for advanced-level studies and assumes acquaintance with elementary statistical terms. After a brief introduction to selected mathematical concepts, the highlighting of selected model features is followed by a practice-oriented introduction to the interpretation of SPSS outputs for the described data analysis methods. Learning data analysis is usually time-consuming and requires effort, but with tenacity the learning process can bring about a significant improvement of individual data analysis skills.
Abstract:
This thesis is an analysis of the recruitment process of the Shining Path (SP) and Revolutionary Movement “Túpac Amaru” (MRTA) guerrilla groups. Although SP was considered more aggressive, it gained more followers than MRTA. This thesis tries to explain why. Social Revolution Theory and Social Movement Theory provide explanations based on issues of “poverty”, disregarding the specific characteristics of the guerrilla groups and their supporters, as well as the influence of specific persuasive processes between the leaders of the groups and their followers. Integrative complexity theory, on the contrary, provides a consistent method to analyze cognitive processes: because people tend to reject complex and sophisticated explanations that require mental effort, simplicity was the key to success. To determine which guerrilla group provided a simpler worldview, a sample of official documents of SP and MRTA is compared. Finally, content analysis is applied through the Paragraph Completion Test (P.C.T.).
Abstract:
The contributions of this dissertation are in the development of two new interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation; (2) a shift-invariant sub-decimation decomposition method that overcomes the deficiency of the decimation process in estimating motion, which stems from the shift-variant property of the wavelet transform. The enormous data generated by digital videos call for efficient video compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies within and between the video frames by applying motion estimation and motion compensation (ME/MC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reasonably, hierarchical motion estimation with coarse-to-fine resolution refinements using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability. Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It exploits the possible intrablocks in the subbands for lower-entropy coding while keeping the low computational load of motion estimation of the level-refined method, thus achieving both temporal compression quality and computational simplicity. Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals to obtain more accurate motion estimation without discontinuous boundary distortions. Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain because of the shift variance of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains the motion consistency between the original frame and the decomposed subframes, thereby improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
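For orientation, the sketch below shows the elementary building block behind such schemes: full-search block matching that minimizes a sum-of-absolute-differences criterion between two frames. The dissertation's level-refined and sub-decimation methods operate on wavelet subbands; this plain spatial-domain version, with an assumed block size and search range, only illustrates the matching criterion that hierarchical refinement accelerates.

```python
# Full-search block-matching motion estimation: for each block of the
# current frame, find the displacement into the reference frame that
# minimizes the sum of absolute differences (SAD).
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Return a dict mapping block positions (row, col) in `cur` to the
    (dy, dx) displacement of the best-matching block in `ref`."""
    h, w = cur.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = cur[by:by + block, bx:bx + block].astype(int)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = ref[y:y + block, x:x + block].astype(int)
                        sad = np.abs(target - cand).sum()
                        if sad < best_sad:
                            best, best_sad = (dy, dx), sad
            vectors[(by, bx)] = best
    return vectors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame0 = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
    frame1 = np.roll(frame0, shift=(2, 1), axis=(0, 1))  # shift down 2, right 1
    # The block of frame1 at (8, 8) originates 2 rows up, 1 column left
    # in frame0, so the expected motion vector is (-2, -1).
    print(block_match(frame0, frame1)[(8, 8)])
```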
Abstract:
Through the application of importance-performance analysis (IPA), the author investigated the conceptualization and measurement of service quality for tour operators in the scuba diving industry. Findings from a study of consumer perceptions of service quality as they relate to a dive tour operator in Western Australia revealed the core service quality dimensions that need to be improved for the operator and demonstrated the value and relative simplicity of importance-performance analysis for dive tour operators generally.
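As a brief illustration of the technique, the sketch below assigns attributes to the four classic IPA quadrants by comparing each attribute's mean importance and performance ratings with the grand means. The attribute names and ratings are invented for illustration; they are not the dive-tour data from the study.

```python
# Importance-performance analysis (IPA): classify each attribute into a
# quadrant by comparing its mean importance and performance ratings with
# the grand means across all attributes.
import numpy as np

attributes = ["safety briefing", "boat comfort", "guide knowledge", "equipment quality"]
importance = np.array([6.5, 4.2, 6.8, 5.9])     # mean ratings, e.g. on a 1-7 scale
performance = np.array([5.1, 5.8, 6.5, 4.3])

imp_cut, perf_cut = importance.mean(), performance.mean()

def quadrant(imp, perf):
    """Map one attribute's ratings to a classic IPA quadrant label."""
    if imp >= imp_cut and perf < perf_cut:
        return "Concentrate here"
    if imp >= imp_cut and perf >= perf_cut:
        return "Keep up the good work"
    if imp < imp_cut and perf < perf_cut:
        return "Low priority"
    return "Possible overkill"

for name, imp, perf in zip(attributes, importance, performance):
    print(f"{name:>20}: {quadrant(imp, perf)}")
```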
Abstract:
Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast and easy-to-implement analytical methodologies for detecting emerging and traditional water and airborne contaminants in South Florida is presented. A novel method was developed for quantification of the herbicide glyphosate based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®. Around 1.8 million gallons of those dispersant formulations were used in the response efforts for the Gulf of Mexico oil spill in 2010. The methods presented here allow the trace-level detection of these compounds in seawater, crude oil and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely Polycyclic Aromatic Hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents. My study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes. The methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin. Assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of filters. The most abundant elements were Fe and Al, followed by Cu, V and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni and Cr) are introduced by anthropogenic activities. Data suggest that the major sources of APM were an electricity plant, road dust, industrial emissions and marine vessels.
Abstract:
This work presents a proposal to build a Mathematics Teaching Laboratory (MTL) whose main theme is the study, construction and use of instruments for navigation and location, approaching the mathematical content in an interdisciplinary way, and to develop a notebook of activities focused on navigational instruments. This required a literature review to understand the different conceptions of the MTL and its pedagogical implications. The methodology used comprised bibliographic research, construction and handling of the instruments, and pedagogical experimentation. Lorenzato (2006) highlights the importance of a suitable environment and instruments for the teacher to do good work. The implementation of an MTL can face some obstacles: the lack of support from other teachers or from the management, the lack of a suitable place to store the materials produced, the lack of time in the teacher's workload to prepare laboratory activities, and so on. Even under unfavorable or adverse conditions, according to Lorenzato (2006), its implementation will benefit teachers and students. The lack of teacher training, in both initial and continuing education, in the use of materials, and the lack of manuals with laboratory activities are also mentioned as factors that keep teachers away from the MTL. To assist elementary and middle school teachers in building a themed MTL, we provide a notebook of activities that offers a didactic sequence involving History and Mathematics. The notebook consists of four activities accompanied by suggestions for teachers; however, the teacher has full autonomy to adapt the activities to the reality of their school. Among the navigation instruments presented in this study, we chose to build the quadrant due to its simplicity, the low cost of its materials and the great teaching potential of this instrument. But a themed laboratory is always being built and rebuilt, as it is a research environment.