903 results for Linux kernel
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The growth of maize (Zea mays L.) kernels depends on the availability of carbon (C) and nitrogen (N) assimilates supplied by the mother plant and the capacity of the kernel to use them. Our objectives were to study the effects of N and sucrose supply levels on growth and metabolism of maize kernels. Kernel explants of Pioneer 34RO6 were cultured in vitro with varying combinations of N (5 to 30 mM) and sucrose (117 to 467 mM). Maximum kernel growth was obtained with 10 mM N and 292 mM sucrose in the medium, and a deficiency of one assimilate could not be overcome by a sufficiency of the other. Increasing the N supply led to increases in the kernel sink capacity (number of cells and starch granules in the endosperm), the activity of certain enzymes (soluble and bound invertases, sucrose synthase, and aspartate aminotransferase), starch, and the levels of N compounds (total N, soluble protein, and free amino acids), and decreased the levels of C metabolites (sucrose and reducing sugars). Conversely, increasing the sucrose supply increased the level of endosperm C metabolites, free amino acids, and ADPG-PPase and alanine transaminase activities, but decreased the activity of soluble invertase and the concentrations of soluble protein and total N. Thus, while C and N are interdependent and essential for the accumulation of maximum kernel weight, they appear to regulate growth by different means. Nitrogen supply aids the establishment of kernel sink capacity and promotes the activity of enzymes related to sucrose and nitrogen uptake, while sucrose regulates the activities of invertase and ADPG-PPase. (C) 1999 Annals of Botany Company.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Maps obtained from remotely sensed orbital images submitted to digital processing have become fundamental to optimizing conservation and monitoring actions for coral reefs. However, the accuracy achieved in the mapping of submerged areas is limited by variation of the water column, which degrades the signal received by the orbital sensor and introduces errors into the final result of the classification. The limited capacity of traditional methods based on conventional statistical techniques to solve problems of inter-class confusion led to the search for alternative strategies in the area of Computational Intelligence. In this work an ensemble of classifiers was built from the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral-reef ecosystem. The system is composed of three stages, through which a progressive refinement of the classification process takes place. Patterns that received an ambiguous classification at a given stage of the process were re-evaluated at the subsequent stage. An unambiguous prediction for all the data was reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, inter-tidal corals, algae, and sandy bottom. The highest overall accuracy (89%) was obtained from SVM with a polynomial kernel. The accuracy of the classified image was compared, through an error matrix, to the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). Finally, the comparison of the results demonstrated the potential of ensemble classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and the water column.
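The staged-refinement idea described in this abstract can be sketched minimally. The following is a hypothetical illustration (not the thesis's code): a nearest-centroid stand-in for the minimum-distance stage that defers ambiguous patterns to a caller-supplied next stage; the class names and the ambiguity threshold are invented for the example.

```python
import math

def centroid(vectors):
    # Component-wise mean of a list of equal-length vectors.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class StagedClassifier:
    """Stage 1 of a cascade: minimum-distance (nearest-centroid) labeling.

    A pattern is 'ambiguous' when its two nearest class centroids are
    almost equally close; such patterns are re-evaluated by the fallback
    (the next stage of the ensemble)."""

    def __init__(self, ambiguity_ratio=0.8):
        self.ambiguity_ratio = ambiguity_ratio
        self.centroids = {}

    def fit(self, X, y):
        for label in set(y):
            self.centroids[label] = centroid(
                [x for x, l in zip(X, y) if l == label])
        return self

    def predict(self, x, fallback):
        ranked = sorted(self.centroids, key=lambda l: dist(x, self.centroids[l]))
        d1 = dist(x, self.centroids[ranked[0]])
        d2 = dist(x, self.centroids[ranked[1]])
        if d2 > 0 and d1 / d2 > self.ambiguity_ratio:
            return fallback(x)   # ambiguous: pass to the next stage
        return ranked[0]         # unambiguous: accept the stage-1 label

# Usage with toy 2-D "pixel" features and invented bottom-type labels:
X = [[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [3.1, 2.9]]
y = ["sand", "sand", "coral", "coral"]
clf = StagedClassifier().fit(X, y)
print(clf.predict([0.1, 0.0], fallback=lambda x: "stage-2"))  # "sand"
print(clf.predict([1.5, 1.5], fallback=lambda x: "stage-2"))  # deferred
```

In the thesis's system the fallback would itself be a trained classifier (an SVM), so each stage only resolves the patterns it can label confidently.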
Abstract:
This study aimed to evaluate the agronomic characteristics, the chemical-bromatological composition, and the digestibility of 11 maize (Zea mays) cultivars harvested at two cutting heights. The cultivars D 766, D 657, D 1000, P 3021, P 3041, C 805, C 333, AG 5011, FO 01, CO 9621, and BR 205 were evaluated when harvested 5 cm above the soil (low) and 5 cm below the insertion of the first ear (high). The experiment was laid out in randomized blocks with three replicates, arranged in an 11 x 2 factorial scheme. The cultivars showed similar forage dry-matter and grain yields. The percentages of the stalk, leaf, husk, cob, and grain fractions differed among cultivars, as did the whole-plant dry-matter contents at harvest. Considering the whole plant, only the gross energy content, the nitrogen content of the neutral detergent fiber fraction, and the in vitro digestibility of neutral detergent fiber and acid detergent fiber did not differ among cultivars. Increasing the cutting height improved forage quality, owing to the reduction of the stalk and leaf fractions and of the cell-wall constituent contents.
Abstract:
There are approaches that take advantage of unused computational resources in Internet nodes - users' machines. In recent years, peer-to-peer (P2P) networks have gained momentum, mainly due to their support for scalability and fault tolerance. However, current P2P architectures present some problems, such as node overhead due to message routing, a great number of node reconfigurations when the network topology changes, routing of traffic inside a specific network even when the traffic is not directed to a machine of that network, and the lack of a relationship between the proximity of P2P nodes and the proximity of those nodes in the IP network. Although some architectures use information about node distance in the IP network, they rely on methods that require dynamic information. In this work we propose a P2P architecture to fix the aforementioned problems. It is composed of three parts. The first part consists of a basic P2P architecture, called SGrid, which maintains a relationship between nodes in the P2P network and their position in the IP network. It assigns adjacent key regions to nodes of the same organization. The second part is a protocol called NATal (Routing and NAT application layer) that extends the basic architecture in order to remove from the nodes the responsibility of routing messages. The third part consists of a special kind of node, called LSP (Lightware Super-Peer), which is responsible for maintaining the P2P routing table. In addition, this work also presents a simulator that validates the architecture and a module of the NATal protocol to be used in Linux routers.
Abstract:
Support Vector Machines (SVM) have attracted increasing attention in the machine learning area, particularly for classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVM in association with interval theory, in order to model the separation of a pattern set into distinct classes with precision, aiming to obtain an optimized separation capable of treating imprecisions contained in the initial data and generated during computational processing. The SVM is a linear machine. In order to allow it to solve real-world problems (usually nonlinear), it is necessary to treat the pattern set, known as the input set, transforming the nonlinear problem into a linear one. Kernel machines are responsible for this mapping. To create the interval extension of SVM, for both linear and nonlinear problems, it was necessary to define an interval kernel and to extend Mercer's theorem (which characterizes a kernel function) to interval functions.
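For reference, the classical condition the thesis extends can be stated as follows (a standard form of Mercer's condition; the interval-valued extension is the thesis's own contribution and is not reproduced here):

```latex
% K is a valid (positive semi-definite) kernel iff it is symmetric and,
% for every square-integrable g,
K(\mathbf{x}, \mathbf{y}) = K(\mathbf{y}, \mathbf{x}),
\qquad
\iint K(\mathbf{x}, \mathbf{y})\, g(\mathbf{x})\, g(\mathbf{y})
      \, d\mathbf{x}\, d\mathbf{y} \;\ge\; 0 .
```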
Abstract:
This dissertation proposes alternative models to allow the interconnection of the data communication networks of COSERN - Companhia Energética do Rio Grande do Norte. These networks comprise the corporate data network, based on the TCP/IP architecture, and the automation system linking remote electric energy distribution substations to the main Operation Center, based on digital radio links and using the IEC 60870-5-101 protocols. The envisaged interconnection aims to provide automation data originating from substations with a contingency route to the Operation Center in moments of failure or maintenance of the digital radio links. Among the presented models, the one chosen for development consists of a computational prototype based on a standard personal computer, working under the Linux operating system and running an application, developed in the C language, which functions as a gateway between the protocols of the TCP/IP stack and the IEC 60870-5-101 suite. The analysis, implementation, and functionality and performance tests of this model are described. During the test phase, the delay introduced by the TCP/IP network when transporting automation data was verified, in order to guarantee that it was consistent with the time periods present on the automation network. In addition, supplementary modules are suggested for the prototype, in order to handle other issues such as security and prioritization of the automation system data whenever they are traversing the TCP/IP network. Finally, a study has been done aiming to integrate the two considered networks in a more complete way. It uses the IP platform as a convergence solution for the communication subsystem of a unified network, as the most recent market tendencies for supervisory and other automation systems indicate.
Abstract:
Recent years have seen an increase in the acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of environments with massive parallel processing (MPP - Massively Parallel Processing) and of distributed computing. A common point between distributed systems and MPP architectures is the notion of message exchange, which allows communication between processes. A message-exchange environment consists basically of a communication library that acts as an extension of the programming languages that allow the elaboration of parallel applications, such as C, C++, and Fortran. In the development of parallel applications, a basic aspect is their performance analysis. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to an increase in the number of processors or in the size of the problem instance. Establishing models or mechanisms that allow this analysis can be a rather complicated task, considering the parameters and degrees of freedom involved in the implementation of a parallel application. One alternative has been the use of tools for the collection and visualization of performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. For an efficient visualization it becomes necessary to identify and collect data relative to the execution of the application, a stage called instrumentation.
This work initially presents a study of the main techniques used in the collection of performance data, followed by a detailed analysis of the main available tools that can be used on Beowulf-cluster parallel architectures running Linux on the x86 platform, with communication libraries based on MPI - Message Passing Interface - such as LAM and MPICH. This analysis is validated on parallel applications that deal with the training of perceptron neural networks using back-propagation. The conclusions show the potential and ease of use of the analyzed tools.
Abstract:
In practically all vertical markets and in every region of the planet, loyalty marketers have adopted the tactic of recognition and reward to identify, maintain, and increase the yield of their customers. Several strategies have been adopted by companies, and the most popular among them is the loyalty program, which deploys a loyalty club to manage these rewards. But the problem with loyalty programs is that customer identification and the transfer of loyalty points are made in a semiautomatic way. With this in mind, this master's work presents an embedded business-automation solution called e-Points. The goal of e-Points is to equip loyalty clubs with fully automated tooling technology to identify customers directly at the point of sale, ensuring greater control over the loyalty of associated members. For this, we developed a hardware platform with an embedded system and RFID technology to be used in retailers' PCs, a smart card to accumulate points with every purchase, and a web server, which will provide services of interest to retailers and to the customers belonging to the club.
Abstract:
This study presents the experimental results of the analysis of the thermal performance of a composite material made from a polyurethane matrix derived from castor-bean kernel oil (COF) and a load of the clay mineral known as expanded vermiculite. Test specimens in the weight proportions of 10%, 15%, and 20% were made to determine the thermal properties: conductivity (k), diffusivity (α), and heat capacity (C). For purposes of comparison, the measurements were also performed on castor-oil polyurethane without the load and on oil polyurethane (PU), both already used in thermal insulation. Plates of 0.25 m of the analyzed material were manufactured for use as insulation in a thermal-performance roofing chamber. Thermocouples were distributed on the surface of the cover, inside the material, and inside the test chamber, which in turn was subjected to artificial heating by a bank of incandescent lamps of 3000 W. The results obtained with the composite materials were compared with data from similar tests conducted with the chamber insulated with: (a) oil PU, (b) COF, (c) glass wool, and (d) rock wool. Heat-resistance tests were performed with these composites, obtaining temperature limits for use in the range of 100 °C to 130 °C. Based on the analysis of the performance results and thermal properties, it was possible to conclude that the COF composites with a load of expanded vermiculite present behavior very close to that exhibited by commercial insulation materials.
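For orientation, the three measured properties are linked by a standard relation not stated in the abstract (ρ here denotes the specimen density and c_p its specific heat, so ρc_p is the volumetric heat capacity):

```latex
\alpha = \frac{k}{\rho\, c_p}
```

Measuring any two of conductivity, diffusivity, and heat capacity therefore determines the third.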
Abstract:
Currently there is still a high demand for quality control in the manufacturing processes of mechanical parts. This keeps alive the need for the inspection of final products, ranging from dimensional analysis to the chemical composition of products. Usually this task may be done through various nondestructive and destructive methods that ensure the integrity of the parts. The results generated by these modern inspection tools end up not being able to geometrically define the real damage and, therefore, cannot be properly displayed on a computing-environment screen. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. One may find commercial software packages that seek to address the stages of design and simulation of mechanical parts in order to predict possible damage, trying to diminish potential undesirable events. However, the challenge of developing software capable of integrating the various activities of design, product inspection, results of non-destructive testing, and the simulation of damage still needs the attention of researchers. This was the motivation to conduct a methodological study for the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software applied to the design and simulation of mechanical parts under stress. This research presents interesting results obtained from the use of the developed kernel, showing that it was successfully applied to design case studies including parts with specific geometries, namely: mechanical prostheses, heat exchangers, and oil and gas piping. Finally, the conclusions regarding the experience of merging CAD and CAE theories to develop the kernel, so as to result in a tool adaptable to various applications of the metalworking industry, are presented.
Abstract:
The text outlines the epistemological and socioanalytical profiles of the paradigmatic question. Mauss had shown the moule affectif (affective mold) of the scientific notions of force and cause. Later, Baudouin would speak of the archetypal induction of notions, and Durand's anthropology of the imaginary would conclude in favor of the archetypal induction of the concept by the image. This led to the unveiling of the unconscious substratum of ideations, a substratum governed by vectorized cathexis and translated into values as the core of ideations: the famous emotional a priori. The text therefore questions two myths that underpin classical science: the myth of scientific objectivity and that of axiological neutrality. It thus highlights the fallacy of the existence of an epistemological rupture between science and ideology. From that point on, ideations become ideologies, above all in the human sciences and the sciences of education, which, moreover, become the support of a disguised ideological struggle in which, in a cognitive colonialism, strategies of knowledge dissimulate strategies of prejudice. Nevertheless, acknowledging the reality of this phantasm-analytic and ideological support affords a salutary educational task: paradigms become fantasies and, in this critical relativization, can be used as a field of collective transitional objects in a cultural and educational playfulness. In the polyculturalism of contemporary society, Weber's polytheism of values becomes an epistemological polytheism, governed by Feyerabend's ontological relativism and by an ethics of pragmatism. Articulating culture, organization, and education, the anthropology of educational organizations and Paula Carvalho's group culturanalysis translate the heuristics of this transitional dialectic.
Abstract:
We investigate several diffusion equations which extend the usual one by considering the presence of nonlinear terms or a memory effect on the diffusive term. We also considered a spatially and temporally dependent diffusion coefficient. For these equations we obtained new classes of solutions and studied their connection with the anomalous diffusion process. We start by considering a nonlinear diffusion equation with a spatially and temporally dependent diffusion coefficient. The solutions obtained for this case generalize the usual one and can be expressed in terms of the q-exponential and q-logarithm functions present in the generalized thermostatistics context (Tsallis formalism). Afterwards, a nonlinear external force is considered. For this case the solutions can also be expressed in terms of the q-exponential and q-logarithm functions. However, by a suitable choice of the nonlinear external force, we may have an exponential behavior, suggesting a connection with standard thermostatistics. This fact reveals that these solutions may present an anomalous relaxation process and then reach an equilibrium state of the Boltzmann-Gibbs kind. Next, we investigate a non-Markovian linear diffusion equation that presents a kernel leading to the anomalous diffusive process. In particular, our first choice leads to both the usual behavior and the anomalous behavior obtained through a fractional-derivative equation. The results obtained within this context correspond to a change in the waiting-time distribution for jumps in the formalism of random walks. These modifications had a direct influence on the solutions, which turned out to be expressed in terms of the Mittag-Leffler or Fox H functions. In this way, the second moment associated with these distributions led to an anomalous spread of the distribution, in contrast to the usual situation where one finds a linear increase with time.
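As an illustration of the kind of solution described (a standard result for the porous-medium-type equation in the Tsallis context; the symbols below are generic, not the thesis's notation), the nonlinear diffusion equation with exponent ν

```latex
\frac{\partial \rho}{\partial t}
  = D \, \frac{\partial^{2} \rho^{\nu}}{\partial x^{2}},
\qquad
e_q(x) \equiv \bigl[\, 1 + (1-q)\,x \,\bigr]^{\frac{1}{1-q}}
```

admits q-Gaussian solutions of the form ρ(x,t) ∝ e_q(-β(t) x²) with q = 2 - ν, recovering the usual Gaussian propagator in the limit q → 1 (ν → 1).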
Abstract:
In this work we studied the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn as estimators of a density function f taking values on a k-dimensional sphere.
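For orientation, the Euclidean prototype of such an estimator is the Rosenblatt-Parzen form below (the thesis adapts this construction to data on a k-dimensional sphere; the notation here is generic):

```latex
f_n(x) = \frac{1}{n\, h_n^{k}} \sum_{i=1}^{n}
         K\!\left( \frac{x - X_i}{h_n} \right),
\qquad h_n \to 0, \quad n\, h_n^{k} \to \infty ,
```

where K is the kernel and h_n the bandwidth; the stated conditions on h_n are the usual ones under which consistency results are proved.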