893 results for Simplicity


Relevance:

10.00%

Publisher:

Abstract:

ACM Computing Classification System (1998): E.4, C.2.1.

Relevance:

10.00%

Publisher:

Abstract:

Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever-increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set will depend on which category of the bigness taxonomy it falls into. Large-p-small-n data sets, for instance, require a different set of tools from the large-n-small-p variety. Among other tools, we discuss preprocessing, standardization, imputation, projection, regularization, penalization, compression, reduction, selection, kernelization, hybridization, parallelization, aggregation, randomization, replication, and sequentialization. Indeed, it is important to emphasize right away that the so-called no free lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
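The large-p-small-n regime mentioned above can be made concrete with a small sketch. With more features than samples, ordinary least squares has infinitely many exact fits, so a penalized (ridge) estimator is used instead; for p > n the dual form only inverts an n × n matrix. All data below are synthetic and illustrative, not from the paper.

```python
import numpy as np

# Hypothetical large-p-small-n setting: n = 20 samples, p = 500 features.
rng = np.random.default_rng(0)
n, p = 20, 500
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[:5] = 2.0                      # only 5 features actually matter
y = X @ true_w + 0.1 * rng.normal(size=n)

# Penalization: ridge regression w = argmin ||Xw - y||^2 + lam * ||w||^2.
# For p > n the dual form w = X^T (X X^T + lam * I_n)^{-1} y is cheaper,
# since it inverts an n x n matrix instead of a p x p one.
lam = 1.0
w_ridge = X.T @ np.linalg.solve(X @ X.T + lam * np.eye(n), y)

err = np.linalg.norm(w_ridge - true_w) / np.linalg.norm(true_w)
print(f"relative error of ridge estimate: {err:.2f}")
```

The dual solution is algebraically identical to the primal one, (X^T X + λI_p)^{-1} X^T y, which is how the shortcut is usually justified.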

Relevance:

10.00%

Publisher:

Abstract:

Random fiber lasers blend attractive features of traditional random lasers, such as low cost and simplicity of fabrication, with high-performance characteristics of conventional fiber lasers, such as good directionality and high efficiency. The low coherence of random lasers is important for speckle-free imaging applications. The random fiber laser with distributed feedback proposed in 2010 led to a quickly developing class of light sources that utilize inherent optical fiber disorder in the form of Rayleigh scattering and distributed Raman gain. The random fiber laser is an interesting and practically important example of a photonic device based on the exploitation of disorder in an optical medium. We provide an overview of recent advances in this field, including high-power and high-efficiency generation, spectral and statistical properties of random fiber lasers, nonlinear kinetic theory of such systems, and emerging applications in telecommunications and distributed sensing.

Relevance:

10.00%

Publisher:

Abstract:

Color information is widely used in the non-destructive quality assessment of perishable horticultural produce. The present work investigated color changes in pepper (Capsicum annuum L.) samples obtained from the retail system. The effect of storage temperature (10±2°C and 24±4°C) on surface color and firmness was analyzed. Hue spectra were calculated using the sum of saturations. A ColorLite sph850 (400-700 nm) spectrophotometer was used as the reference instrument. Dynamic firmness was measured at three locations on the surface: tip cap, middle and shoulder. Significant effects of storage conditions and surface location on both color and firmness were observed. The hue spectra responded sensitively to the color development of the pepper. A partial least squares (PLS) prediction model was used to estimate dynamic firmness from the hue spectra. Accuracy differed considerably depending on the location: firmness of the tip cap was predicted with the highest accuracy (RMSEP=0.0335), whereas the middle region could not be used for this purpose. Owing to its simplicity and rapid processing, analysis of hue spectra is a promising tool for the evaluation of color in the postharvest and food industry.
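The PLS calibration step described above can be sketched as follows, here with a minimal PLS1 (NIPALS) implementation and synthetic "hue spectra" standing in for the study's measurements; the data, band indices and component count are illustrative assumptions.

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal PLS1 via NIPALS; returns a predict(Xnew) closure."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)            # weight vector
        t = Xr @ w                        # scores
        p = Xr.T @ t / (t @ t)            # X loadings
        q = (yr @ t) / (t @ t)            # y loading
        Xr = Xr - np.outer(t, p)          # deflate X
        yr = yr - q * t                   # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)   # regression coefficients
    return lambda Xn: (Xn - x_mean) @ B + y_mean

# Synthetic example: "firmness" depends on two hue bands plus noise.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(40, 30))
firmness = 0.8 * spectra[:, 3] - 0.5 * spectra[:, 10] + 0.05 * rng.normal(size=40)
predict = pls1(spectra, firmness, n_components=3)
rmse = np.sqrt(np.mean((predict(spectra) - firmness) ** 2))
print(f"training RMSE: {rmse:.3f}")
```

In practice the number of latent components would be chosen by cross-validation, and the reported RMSEP would come from a held-out prediction set rather than the training fit.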

Relevance:

10.00%

Publisher:

Abstract:

With the latest developments in computer science, multivariate data analysis methods have become increasingly popular among economists. Pattern recognition in complex economic data and empirical model construction can be more straightforward with the proper application of modern software. However, despite the appealing simplicity of some popular software packages, the interpretation of data analysis results requires strong theoretical knowledge. This book aims to combine the development of both theoretical and application-related data analysis knowledge. The text is designed for advanced-level studies and assumes acquaintance with elementary statistical terms. After a brief introduction to selected mathematical concepts, the highlighting of selected model features is followed by a practice-oriented introduction to the interpretation of SPSS outputs for the described data analysis methods. Learning data analysis is usually time-consuming and requires effort, but with tenacity the learning process can bring about a significant improvement in individual data analysis skills.

Relevance:

10.00%

Publisher:

Abstract:

This thesis is an analysis of the recruitment processes of the Shining Path (SP) and the Revolutionary Movement "Túpac Amaru" (MRTA) guerrilla groups. Although SP was considered more aggressive, it gained more followers than MRTA; this thesis tries to explain why. Social Revolution Theory and Social Movement Theory provide explanations based on issues of "poverty", disregarding the specific characteristics of the guerrilla groups and their supporters, as well as the influence of specific persuasive processes between the leaders of the groups and their followers. Integrative complexity theory, on the contrary, provides a consistent method for analyzing cognitive processes: because people tend to reject complex and sophisticated explanations that require mental effort, simplicity was the key to success. To establish which guerrilla group provided the simpler worldview, samples of official documents from SP and MRTA are compared. Finally, content analysis is applied through the Paragraph Completion Test (P.C.T.).

Relevance:

10.00%

Publisher:

Abstract:

The contributions of this dissertation are in the development of two new interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation; and (2) a shift-invariant sub-decimation decomposition method to overcome the deficiency of the decimation process in estimating motion, caused by the shift-variant property of the decimation in the wavelet transform.

The enormous data generated by digital videos call for efficient video compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies within and between video frames by applying motion estimation and motion compensation (MEMC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reliably, hierarchical motion estimation with coarse-to-fine resolution refinements using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability.

Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It exploits possible intrablocks in the subbands for lower-entropy coding while keeping the low computational load of level-refined motion estimation, thus achieving both temporal compression quality and computational simplicity.

Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals, allowing more accurate motion estimation without discontinuous boundary distortions.

Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain because of the shift-variant property of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains motion consistency between the original frame and the decomposed subframes, consequently improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
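The shift-variance problem that motivates the sub-decimation method can be demonstrated in a few lines: decimating a signal and a one-sample-shifted copy yields unrelated subband coefficients, which is what defeats naive block matching in the wavelet domain. This is a generic Haar-filter illustration, not the dissertation's actual decomposition.

```python
import numpy as np

def haar_lowpass(x):
    """One-level Haar approximation: average adjacent pairs, decimate by 2."""
    return (x[0::2] + x[1::2]) / np.sqrt(2)

x = np.array([0., 0., 1., 2., 3., 2., 1., 0.])   # a toy "scanline"
x_shift = np.roll(x, 1)                           # same content, shifted 1 px

a = haar_lowpass(x)
a_shift = haar_lowpass(x_shift)

# If decimation were shift-invariant, a_shift would simply be a shifted
# version of a; instead the coefficient patterns differ, so motion in the
# original frame is not preserved in the subbands.
print(a)        # subband of the original scanline
print(a_shift)  # subband of the shifted scanline -- not a shifted copy of a
```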

Relevance:

10.00%

Publisher:

Abstract:

Through the application of importance-performance analysis (IPA), the author investigated the conceptualization and measurement of service quality for tour operators in the scuba diving industry. Findings from a study of consumer perceptions of service quality as they relate to a dive tour operator in Western Australia revealed the core service quality dimensions that need to be improved for the operator and demonstrated the value and relative simplicity of importance-performance analysis for dive tour operators generally.
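The quadrant step at the heart of IPA can be sketched as follows: each attribute's mean importance and performance ratings are compared against the grand means, placing it in one of four action quadrants. The attribute names and scores are hypothetical, not the study's data.

```python
# Hypothetical service attributes: (mean importance, mean performance) on 1-5.
attributes = {
    "dive briefing quality": (4.6, 4.4),
    "equipment condition":   (4.8, 3.2),
    "boat comfort":          (3.1, 4.2),
    "souvenir selection":    (2.5, 2.8),
}

# Crosshairs of the IPA grid: grand means of importance and performance.
imp_mean = sum(i for i, _ in attributes.values()) / len(attributes)
perf_mean = sum(p for _, p in attributes.values()) / len(attributes)

def quadrant(importance, performance):
    """Classify one attribute into the standard four IPA quadrants."""
    if importance >= imp_mean:
        return "concentrate here" if performance < perf_mean else "keep up the good work"
    return "low priority" if performance < perf_mean else "possible overkill"

for name, (imp, perf) in attributes.items():
    print(f"{name:22s} -> {quadrant(imp, perf)}")
```

High-importance/low-performance attributes ("concentrate here") are the core dimensions flagged for improvement, which is the managerial output the abstract refers to.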

Relevance:

10.00%

Publisher:

Abstract:

Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast and easy-to-implement analytical methodologies for detecting emerging and traditional water and airborne contaminants in South Florida are presented. A novel method was developed for the quantification of the herbicide glyphosate based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate the estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®. Around 1.8 million gallons of these dispersant formulations were used in the response efforts for the Gulf of Mexico oil spill in 2010. The methods presented here allow the trace-level detection of these compounds in seawater, crude oil and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely polycyclic aromatic hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents. My study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes. The methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin.
Assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of filters. The most abundant elements were Fe and Al, followed by Cu, V and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni and Cr) are introduced by anthropogenic activities. Data suggest that the major sources of APM were an electricity plant, road dust, industrial emissions and marine vessels.
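The enrichment-factor calculation used above to separate crustal from anthropogenic sources can be sketched as follows: each element is normalized to a crustal reference element (Al here), and ratios far above unity point to anthropogenic input. The sample concentrations and the crustal abundances below are rough illustrative placeholders, not the study's data; real work should take abundances from a published crustal-composition table.

```python
# Approximate upper-crust abundances (mg/kg) -- illustrative values only.
crust = {"Al": 80000.0, "Fe": 39000.0, "Pb": 17.0, "Zn": 67.0}
# Hypothetical APM filter concentrations (arbitrary common units).
sample = {"Al": 1200.0, "Fe": 900.0, "Pb": 8.0, "Zn": 30.0}

def enrichment_factor(element, ref="Al"):
    """EF = (C_el / C_ref)_sample / (C_el / C_ref)_crust."""
    return (sample[element] / sample[ref]) / (crust[element] / crust[ref])

# A common rule of thumb reads EF >> 10 as anthropogenic enrichment.
for el in ("Fe", "Pb", "Zn"):
    ef = enrichment_factor(el)
    origin = "likely anthropogenic" if ef > 10 else "largely crustal"
    print(f"{el}: EF = {ef:.1f} ({origin})")
```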

Relevance:

10.00%

Publisher:

Abstract:

This work presents a proposal to build a Mathematics Teaching Laboratory (MTL) whose main theme is the study, construction and use of navigation and location instruments to approach mathematical content in an interdisciplinary way, and to develop a notebook of activities focused on navigational instruments. To this end, a literature review was necessary to understand the different conceptions of the MTL and their pedagogical implications. The methodology comprised literature research, the construction and handling of instruments, and pedagogical experimentation. Lorenzato (2006) highlights the importance of a suitable environment and suitable instruments for a professional to do good work. The implementation of an MTL may face some obstacles: a lack of support from other teachers or from the administration, the lack of a suitable place to store the materials produced, the lack of time in the teacher's workload to prepare laboratory activities, and so on. Even under unfavorable or adverse conditions, according to Lorenzato (2006), its implementation will benefit teachers and students. Gaps in teachers' initial and continuing education regarding the use of materials, and the lack of manuals with laboratory activities, are also mentioned as factors that keep teachers away from the MTL. With the purpose of assisting elementary and middle school teachers in building a thematic MTL, we prepared and are providing a notebook of activities containing a didactic sequence involving History and Mathematics. The notebook consists of four activities accompanied by suggestions for teachers; the teacher nevertheless has full autonomy to adapt the activities to the reality of their school. Among the navigation instruments presented in this study, we chose to build the quadrant because of its simplicity, the low cost of its materials and the great teaching potential this instrument has. But a thematic laboratory is always being built and rebuilt, as it is a research environment.

Relevance:

10.00%

Publisher:

Abstract:

The demand for materials with high consistency obtained at relatively low temperatures has been driving the search for chemical processes to replace the conventional ceramic method. This work aims to obtain nanosized encapsulated (core-shell) pigments based on TiO2 doped with transition metals (Fe, Co, Ni, Al) through three methods of synthesis: polymeric precursors (Pechini), microwave-assisted hydrothermal synthesis, and co-precipitation associated with sol-gel chemistry. The study was motivated by the simplicity, speed and low power consumption characteristic of these methods. Their costs are affordable, and they allow good control of the microstructure, combined with high purity and controlled stoichiometric phases, while yielding particles of nanometric size. The physical, chemical, morphological, structural and optical properties of the materials obtained were analyzed using different characterization techniques. The powder pigments were tested in discoloration and degradation experiments in a photoreactor using a solution of Remazol gold yellow dye (NNI), followed by filtration to separate the solution from the pigments, with the filtrate available for subsequent UV-Vis measurements. After the powders were obtained by the different methods, two calcination temperatures were used: 400 °C and 1000 °C. A fixed concentration of 10% (Fe, Al, Ni, Co) by mass relative to the mass of titanium made the study technologically and economically feasible. Transmission electron microscopy (TEM) made it possible to analyze and confirm the formation of nanosized encapsulated pigment particles, with TiO2 diameters from 20 nm to 100 nm and coating layers of Fe, Ni and Co between 2 nm and 10 nm thick. The most efficient synthesis method studied in this work was co-precipitation associated with sol-gel chemistry, with which the best results were achieved without the need to calcine the obtained powders.

Relevance:

10.00%

Publisher:

Abstract:

Chondroitin sulfate (CS) is a naturally occurring glycosaminoglycan found in the extracellular matrix of connective tissues, from which it may be extracted and purified. CS is involved in various biological functions, which may be related to its structural variability, despite the simplicity of the linear chain structure of this molecule. Research in the biotechnology and pharmaceutical fields using wastes from aquaculture has been developed in Brazil. In recent decades, tilapia (Oreochromis niloticus), a fish native to Africa, has been one of the most cultivated species in various regions of the world, including Brazil. Tilapia farming is a cost-effective activity; however, it generates a large amount of waste that is discarded by producers. Waste from tilapia can thus be used in research as a source of molecules with important biotechnological applications, which also helps reduce environmental impacts and promotes the development of an eco-friendly activity. Accordingly, Nile tilapia viscera were subjected to proteolysis; the glycosaminoglycans were then complexed with an ion exchange resin (Lewatit), fractionated with increasing volumes of acetone and purified by DEAE-Sephacel ion exchange chromatography. The resulting fraction was analyzed by agarose gel electrophoresis and nuclear magnetic resonance (NMR). The electrophoretic profile of the compound, together with the analysis of the 1H NMR spectra and the HSQC correlation, allows us to affirm that the compound corresponds to a chondroitin sulfate-like molecule. The MTT assay was used to assess cell viability in the presence of the isolated tilapia CS and showed that the compound is not cytotoxic to normal cells such as mouse embryo fibroblasts (3T3). The compound was then tested for its ability to reduce the influx of leukocytes in an in vivo model of acute peritonitis induced by sodium thioglycolate.

In this context, total and differential leukocyte counts were performed on blood and peritoneal fluid collected, respectively, from the vena cava and the peritoneal cavity of the animals subjected to the experiment. The chondroitin sulfate isolated for the first time from tilapia (CST) was able to reduce the migration of leukocytes to the peritoneal cavity of inflamed mice by up to 80.4% at a dose of 10 µg/kg. The results also show a significant reduction (p<0.001) in the population of polymorphonuclear leukocytes in the peritoneal cavity at the three tested doses (0.1 µg/kg, 1 µg/kg and 10 µg/kg) when compared to the positive control (thioglycolate only). Therefore, once the CST structure and mechanism of action have been completely elucidated, this compound may have potential for therapeutic use in inflammatory diseases.

Relevance:

10.00%

Publisher:

Abstract:

Cold-water building installations are key parts of any housing model, whether houses or condominiums. However, these systems are subject to failures, which can range from a leak in a fixture to faults in the structure of water reservoirs and the distribution system. These faults are responsible for great economic and environmental costs. In order to reduce these losses, this work proposes the development of a system able to detect the presence of, and identify some types of, water leaks that may occur. For implementation and testing, a consumption model was used in a simulator capable of reproducing behavior similar to that of a real system, including its failures. Leak detection is based on an expert-system-like model with two detection modules, one active and one passive, which use an array of sensors and actuators (valves) for sensing. For testing and implementation, software was developed to couple the simulator and the detector. The results show that the proposed system, besides functioning satisfactorily, can be easily implemented in microcontrollers or embedded systems due to its simplicity.
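The passive-detection idea described above can be sketched with a simple rule set: during a window in which all fixtures should be closed, any sustained flow above a noise threshold is flagged, and its magnitude gives a crude leak classification, in the spirit of an expert-rule system. The thresholds and categories are hypothetical placeholders, not the paper's model.

```python
def classify_leak(flow_lpm, noise_threshold=0.05):
    """Classify a flow reading (L/min) taken during a no-consumption window.

    Thresholds are illustrative: a real expert system would tune them to the
    installation and combine readings from several sensors and valve states.
    """
    if flow_lpm <= noise_threshold:
        return "no leak detected"
    if flow_lpm < 1.0:
        return "small leak (e.g. dripping fixture)"
    if flow_lpm < 10.0:
        return "moderate leak (e.g. running toilet or open valve)"
    return "major leak (possible pipe rupture)"

# Simulated night-time flow-meter readings, as the simulator might produce.
readings = [0.02, 0.4, 3.5, 25.0]
for r in readings:
    print(f"{r:5.2f} L/min -> {classify_leak(r)}")
```

The active module of the paper's detector would go further, closing valves section by section to localize which branch the residual flow comes from.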

Relevance:

10.00%

Publisher:

Abstract:

A typical electrical power system is characterized by the centralization of power generation. However, with the restructuring of the electric system, this topology is changing with the insertion of generators in parallel with the distribution system (distributed generation), which provides several benefits by being located near energy consumers. The integration of distributed generators, especially from renewable sources, into the Brazilian system has therefore become more common every year. However, this new system topology may result in new challenges in the fields of power system control, operation, and protection. One of the main problems related to distributed generation is islanding, which can pose a safety risk to people and to the power grid. Among the several islanding protection techniques, passive techniques offer low implementation cost and simplicity, requiring only voltage and current measurements to detect system problems. This work proposes a protection system based on the wavelet transform, with overcurrent and under/overvoltage functions as well as information on fault-induced transients, in order to provide fast detection and identification of faults in the system. The proposed protection scheme was evaluated through simulation and experimental studies, with performance similar to the conventional overcurrent and under/overvoltage methods, but with the additional detection of the exact moment of the fault.
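The conventional under/overvoltage and overcurrent functions that serve as the baseline above amount to simple threshold comparators. A minimal sketch follows; the pickup settings (per-unit voltage window, current threshold) are illustrative assumptions, while the device numbers 27, 59 and 50/51 are the standard ANSI designations for these functions.

```python
# Hypothetical pickup settings for a passive protection relay.
V_MIN_PU, V_MAX_PU = 0.88, 1.10     # under/overvoltage window, per unit
I_MAX_A = 400.0                     # overcurrent pickup, amperes

def relay_trip(v_pu, i_a):
    """Return the list of protection functions that pick up for one sample."""
    picked = []
    if v_pu < V_MIN_PU:
        picked.append("undervoltage (27)")
    elif v_pu > V_MAX_PU:
        picked.append("overvoltage (59)")
    if i_a > I_MAX_A:
        picked.append("overcurrent (50/51)")
    return picked

# Normal operation, an islanding-like voltage sag, and a short circuit:
for v, i in [(1.00, 120.0), (0.80, 150.0), (0.60, 900.0)]:
    print(f"V={v:.2f} pu, I={i:.0f} A -> {relay_trip(v, i) or ['no trip']}")
```

The wavelet-based scheme in the work adds transient analysis on top of comparators like these, which is what lets it pinpoint the exact moment of the fault.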

Relevance:

10.00%

Publisher:

Abstract:

Benzylpenicillin (PENG) has been used as the active ingredient in veterinary medicinal products to increase productivity, owing to its therapeutic properties. However, products of poor quality, used indiscriminately, result in residues in foods destined for human consumption, especially in milk, which is essential to the diet of children and the elderly. It is therefore indispensable to develop new methods able to detect these residues in food at levels that are toxic to human health, in order to contribute to consumer food safety and to assist regulatory agencies in efficient inspection. In this work, methods were developed for the quality control of veterinary drugs based on benzylpenicillin (PENG) used in livestock production. Additionally, methodologies were validated for identifying and quantifying residues of this antibiotic in bovine and caprine milk. The analytical control was performed in two steps. In the first, the groups of medicinal product samples I, II, III, IV and V were individually characterized by mid-infrared spectroscopy (4000–600 cm-1). In addition, 37 samples distributed among these groups were analyzed by spectroscopy in the ultraviolet, visible and near-infrared region (UV-VIS-NIR) and by ultra-fast liquid chromatography with diode array detection (UFLC-DAD). The characterization results indicated similarities between PENG and reference standard samples, primarily in the 1818–1724 cm-1 region of the ν C=O band, which shows the primary amide features of PENG. The UFLC-DAD method presented an R of 0.9991, an LOD of 7.384 × 10-4 μg mL-1 and an LOQ of 2.049 × 10-3 μg mL-1. The analysis showed that 62.16% of the samples presented purity ≥ 81.21%. The UV-VIS-NIR spectroscopic method presented a mean error of 8–12% between the reference and experimental criteria, indicating that it is a reliable choice for the rapid determination of PENG.

In the second stage, a method was established for the extraction and isolation of PENG by the addition of McIlvaine buffer, used for the precipitation of total proteins, at pH 4.0. The results showed excellent PENG recovery values, close to 92.05% for bovine milk samples (method 1), while for goat milk samples (method 2) the PENG recovery was 95.83%. The UFLC-DAD methods were validated in accordance with the maximum residue limit (MRL) of 4 μg kg-1 standardized by CAC/GL16. Validation of method 1 indicated an R of 0.9975, an LOD of 7.246 × 10-4 μg mL-1 and an LOQ of 2.196 × 10-3 μg mL-1. Application of method 1 showed that 12% of the samples presented PENG residue concentrations above the MRL. Method 2 indicated an R of 0.9995, an LOD of 8.251 × 10-4 μg mL-1 and an LOQ of 2.5270 × 10-3 μg mL-1. Application of this method showed that 15% of the samples were above the tolerable limit. Comparative analysis between the methods pointed to better validation for the LCP samples, because of the reduced matrix effect (hence t_calculated < t_table), caused by the increased recovery of PENG. In this way, all the operations developed deliver simplicity, speed, selectivity, and reduced analysis time and use of reagents and toxic solvents, particularly when compared to established methodologies.
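Figures of merit like the R, LOD and LOQ values reported above are typically derived from a calibration curve using the standard ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the slope and σ the residual standard deviation of the regression. The standard concentrations and detector responses below are made-up numbers for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical calibration standards (ug/mL) and detector responses.
conc = np.array([0.005, 0.01, 0.05, 0.1, 0.5, 1.0])
area = np.array([0.51, 1.02, 4.9, 10.1, 50.3, 99.8])

# Linear least-squares calibration: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # ddof=2: two fitted parameters
r = np.corrcoef(conc, area)[0, 1]    # correlation coefficient R

lod = 3.3 * sigma / slope            # limit of detection
loq = 10.0 * sigma / slope           # limit of quantification
print(f"R = {r:.4f}, LOD = {lod:.2e} ug/mL, LOQ = {loq:.2e} ug/mL")
```

By construction, LOQ/LOD is always 10/3.3 ≈ 3.0 with this approach, which is a quick sanity check on reported values.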