922 results for multi-column process


Relevância:

30.00%

Publicador:

Resumo:

The globular cluster HP 1 is projected on the bulge, very close to the Galactic center. The Multi-Conjugate Adaptive Optics Demonstrator on the Very Large Telescope allowed us to acquire high-resolution deep images that, combined with first-epoch New Technology Telescope data, enabled us to derive accurate proper motions. The cluster and bulge fields' stellar contents were disentangled through this process, producing color-magnitude diagrams of unprecedented definition for this cluster. The metallicity of [Fe/H] ≈ -1.0 from previous spectroscopic analysis is confirmed, which, together with an extended blue horizontal branch, implies an age older than the halo average. Orbit reconstruction results suggest that HP 1 is spatially confined within the bulge.

Relevância:

30.00%

Publicador:

Resumo:

Successful classification, information retrieval and image analysis tools are intimately related to the quality of the features employed in the process. Pixel intensities, color, texture and shape are generally the basis from which most features are computed and used in such fields. This paper presents a novel shape-based feature extraction approach in which an image is decomposed into multiple contours that are further characterized by Fourier descriptors. Unlike traditional approaches, we make use of topological knowledge to generate well-defined closed contours, which are efficient signatures for image retrieval. The method has been evaluated in the context of content-based image retrieval (CBIR) and image analysis. The results show that the multi-contour decomposition, as opposed to a single shape representation, introduces a significant improvement in discrimination power. (c) 2008 Elsevier B.V. All rights reserved.
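The abstract does not give implementation details; as a minimal sketch of the general technique it names, the snippet below computes translation-, scale- and rotation-invariant Fourier descriptors for one closed contour with NumPy (contour extraction and the paper's topological multi-contour decomposition are assumed to be handled elsewhere).

```python
import numpy as np

def fourier_descriptors(contour, n_descriptors=16):
    """Compute invariant Fourier descriptors for a closed contour.

    contour: (N, 2) array of (x, y) boundary points, ordered along the curve.
    Returns n_descriptors magnitudes, invariant to translation, scale and
    rotation (and to the starting point, since phases are discarded).
    """
    # Represent boundary points as complex numbers and take the DFT.
    z = contour[:, 0] + 1j * contour[:, 1]
    coeffs = np.fft.fft(z)

    # Drop the DC term (translation) and normalize by the first harmonic (scale).
    coeffs = coeffs[1:]
    coeffs = coeffs / np.abs(coeffs[0])

    # Keep magnitudes only (rotation / starting-point invariance).
    return np.abs(coeffs[:n_descriptors])

# Toy usage: descriptors of a unit circle sampled at 128 points.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
print(fourier_descriptors(circle, n_descriptors=8))
```

In a multi-contour setting, one such descriptor vector per contour could be concatenated or compared individually, but how the paper combines them is not specified in this abstract.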

Relevância:

30.00%

Publicador:

Resumo:

Internet of Things is an umbrella term for the development in which different kinds of devices are equipped with sensors and data chips connected to the internet. A growing amount of data means a growing demand for solutions that can store, track, analyze and process data. One way to meet this demand is to use cloud-based real-time analytics services. Multi-tenant and single-tenant are two architectures for cloud-based real-time analytics services that can be used to handle the increasing data volumes, and they differ in development complexity. In this work, Azure Stream Analytics represents a multi-tenant architecture and HDInsight/Storm represents a single-tenant architecture. To compare cloud-based real-time analytics services with different architectures, we chose to use the usability criteria efficiency, effectiveness and user satisfaction. We wanted to answer the following questions related to these three criteria: (1) What similarities and differences can we see in development times? (2) Can we identify differences in functionality? (3) How do developers experience the two analytics services? We used a design-and-creation strategy to develop two Proof of Concept prototypes and collected data using several data collection methods. The Proof of Concept prototypes comprised two artifacts, one for Azure Stream Analytics and one for HDInsight/Storm. We evaluated them by carrying out five different scenarios, each with 2-5 sub-goals. We simulated streaming data by letting an application continuously generate random data, which we analyzed with the two real-time analytics services. We used observations to document how we worked with the development of the analytics services, to measure development times and to identify differences in functionality. We also used questionnaires to find out what users thought of the analytics services. We concluded that Azure Stream Analytics was initially more usable than HDInsight/Storm, but that the differences decreased over time. Azure Stream Analytics was easier to work with for simpler analyses, whereas HDInsight/Storm offered a broader range of functionality.

Relevância:

30.00%

Publicador:

Resumo:

This work discusses the implementation of the postponement strategy, a concept that has grown in importance in recent years. Postponement is an operational concept that consists of delaying the final configuration of products until customers' orders are received. Despite the theoretical attractiveness and relevance of the topic, little is yet known about its implementation process, especially in the Brazilian business environment. This work investigated in depth the implementation of postponement in five well-regarded companies in Brazil, seeking to identify the reasons that led their executives to adopt this strategy, the facilitating agents and the obstacles to implementation, and, finally, to what extent postponement contributed to an increase in competitiveness.

Relevância:

30.00%

Publicador:

Resumo:

Poverty in Brazil has been gradually reduced. Among the main reasons are public policies for the universalization of rights. On the other hand, the municipalities' Human Development Index indicates scenarios of growing inequality: some regions, basically rural in character, were left behind in that development process. In 2008 the federal government launched the "Territórios da Cidadania" (Territories of Citizenship) Program under high expectations. It proposed to develop those regions and to prioritize delivering ongoing federal public policies where they were most needed. The program featured an innovative arrangement that included dozens of ministries and other federal agencies, state governments, municipalities and collegiate bodies for the participatory management and control of the territory. In this structure, both new and existing jurisdictions came to support the program's coordination. This arrangement was classified as an example of multi-level governance, a theory that proved an effective instrument for understanding the intra- and intergovernmental relations under which the program took place. The program lasted only three years. In the Vale do Ribeira Territory (SP), few community leaders are aware of it, and even they have little information about its actions and effects. Against this background, this research studies the program's coordination and governance structure, from the Vale do Ribeira Territory, taken as the most local level, up to the federal government, based on the hypothesis that, beyond the local contingencies of Vale do Ribeira, the design and implementation of the Territories of Citizenship Program as formulated have fundamental structural problems that hinder its goals of reducing poverty and inequality through the promotion of territorial development. As a complement, a specific goal was to survey the program's design and background in order to understand how the relations, whether or not foreseen in its structure, were formulated and how they developed, with special attention to Vale do Ribeira (SP). Generally speaking, it was concluded that the coordination and governance arrangement of the Territories of Citizenship Program failed because it did not develop qualified solutions to deal with the challenges of the Brazilian federalist structure, party politics, sectorized public actions, or the contingencies and specificities of the territory. The complexity of the program, of the poverty problem it proposed to face and of the territorial development strategy imposed a high coordination cost, which the chosen model of centralization in the federal government with internal decentralization of coordination could not meet. When the presidency changed in 2011, the program could not present results capable of justifying its continuation; it was therefore halted, lost its priority status, and the resources previously invested were redirected.

Relevância:

30.00%

Publicador:

Resumo:

LOPES, Jose Soares Batista et al. Application of multivariable control using artificial neural networks in a debutanizer distillation column. In: INTERNATIONAL CONGRESS OF MECHANICAL ENGINEERING - COBEM, 19, 5-9 Nov. 2007, Brasilia. Anais... Brasilia, 2007.

Relevância:

30.00%

Publicador:

Resumo:

We propose a new approach to the reduction and abstraction of visual information for robotic vision applications. Basically, we propose using a multi-resolution representation in combination with a moving fovea to reduce the amount of information taken from an image. We introduce the mathematical formalization of the moving fovea approach and mapping functions that help to use this model. Two indexes (resolution and cost) are proposed that can be useful for choosing the model variables. With this new theoretical approach, it is possible to apply several filters, to calculate disparity and to obtain motion analysis in real time (less than 33 ms to process an image pair on a notebook with an AMD Turion Dual Core 2 GHz processor). As the main result, most of the time the moving fovea allows the robot to keep a region of interest visible in both images without physically moving its robotic devices. We validate the proposed model with experimental results.
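The abstract gives only the idea of a multi-resolution representation centered on a moving fovea; the mathematical formalization is in the paper itself. As an illustrative sketch under that assumption (window sizes, level count and subsampling scheme are choices made for the example), the code below builds a simple foveated pyramid: successively larger windows around a fovea point are downsampled to a fixed output size, so the total number of pixels handled is far smaller than in the full image.

```python
import numpy as np

def foveated_pyramid(image, fovea, levels=4, out_size=64):
    """Illustrative multi-resolution representation around a moving fovea.

    image:  2-D grayscale array.
    fovea:  (row, col) center of attention; updating it moves the fovea.
    levels: number of resolution levels; level 0 is the sharpest, smallest window.
    Each level crops a window of side out_size * 2**level around the fovea and
    downsamples it to roughly out_size x out_size by simple striding.
    """
    h, w = image.shape
    r, c = fovea
    pyramid = []
    for k in range(levels):
        half = (out_size * 2 ** k) // 2
        r0, r1 = max(0, r - half), min(h, r + half)
        c0, c1 = max(0, c - half), min(w, c + half)
        window = image[r0:r1, c0:c1]
        step = 2 ** k
        pyramid.append(window[::step, ::step])  # coarse subsampling
    return pyramid

# Toy usage: a 480x640 image with the fovea near the center.
img = np.random.rand(480, 640)
levels = foveated_pyramid(img, fovea=(240, 320))
print([lvl.shape for lvl in levels])
```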

Relevância:

30.00%

Publicador:

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância:

30.00%

Publicador:

Resumo:

This work aims to develop an intelligent system for detecting workpiece burn in the tangential surface grinding process using a multi-layer perceptron neural network, trained to generalize the process and, consequently, to obtain the burn threshold. In general, the occurrence of burn in the grinding process can be detected by the DPO and FKS parameters. However, these parameters are not effective under the machining conditions used in this work. The acoustic emission signal and the electric power of the wheel drive motor are the input variables, and the output variable is the occurrence of burn. In the experimental work, one type of steel (quenched ABNT 1045) and one type of grinding wheel, designated TARGA, model ART 3TG80.3 NVHB, were employed.
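As a hedged illustration of the kind of model the abstract describes (not the authors' actual network or data), the sketch below trains a small multi-layer perceptron on two input signals, acoustic emission and drive-motor power, to predict a binary burn/no-burn label; the synthetic training data and the network size are assumptions made for the example.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in data: [acoustic_emission, motor_power] per grinding pass,
# with burn (label 1) loosely associated with high values of both signals.
rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, 2.0], scale=[0.3, 0.5], size=(400, 2))
y = ((X[:, 0] > 1.1) & (X[:, 1] > 2.2)).astype(int)

# Small multi-layer perceptron; inputs are standardized first.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X, y)

# Predicted burn probability for a new (acoustic emission, power) measurement.
print(model.predict_proba([[1.3, 2.6]])[0, 1])
```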

Relevância:

30.00%

Publicador:

Resumo:

Fiber reinforced polymer composites have been widely applied in the aeronautical field. However, composite processing that uses open (unlocked) molds should be avoided in view of the tight requirements and the possibility of environmental contamination. To produce high-performance structural frames meeting aeronautical reproducibility and low-cost criteria, the Brazilian industry has shown interest in investigating the resin transfer molding (RTM) process, a closed-mold pressure injection system that allows faster gel and cure times. Because of the anisotropic and non-homogeneous character of fibrous composites, their fatigue behavior is a complex phenomenon quite different from that of metallic materials and is crucial to investigate for aeronautical applications. Fatigue sub-scale specimens of intermediate-modulus carbon fiber non-crimp multi-axial reinforcement and a mono-component epoxy system were produced according to ASTM D 3039. Axial fatigue tests were carried out according to ASTM D 3479, with a sinusoidal load at 10 Hz and a load ratio R = 0.1. A high fatigue interval was observed for the NCF/RTM6 composites. Weibull statistical analysis was applied to describe the failure probability of the materials under cyclic loads, and the fracture patterns were observed by scanning electron microscopy. (C) 2010 Published by Elsevier Ltd.
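The abstract mentions Weibull statistics for the failure probability under cyclic loads without giving details; as a minimal sketch of that kind of analysis (the cycle counts below are made up for illustration), the code fits a two-parameter Weibull distribution to fatigue lives and evaluates the failure probability at a chosen number of cycles.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical fatigue lives (cycles to failure) at one stress level.
cycles_to_failure = np.array([1.2e5, 1.8e5, 2.3e5, 2.9e5, 3.4e5, 4.1e5, 5.0e5])

# Fit a two-parameter Weibull distribution (location fixed at zero).
shape, loc, scale = weibull_min.fit(cycles_to_failure, floc=0)
print(f"Weibull shape (modulus): {shape:.2f}, scale: {scale:.3g} cycles")

# Probability of failure before 2e5 cycles under the fitted model.
print(f"P(failure before 2e5 cycles) = {weibull_min.cdf(2e5, shape, loc, scale):.2f}")
```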

Relevância:

30.00%

Publicador:

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância:

30.00%

Publicador:

Resumo:

Although individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, provide solutions that are usually considered efficient, experimental results obtained with large pattern sets, or with sets containing a significant amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. With the intention of improving the performance and efficiency of these ML techniques, the idea of making several ML algorithms work jointly arose, giving origin to the term Multi-Classifier System (MCS). An MCS has different ML algorithms, called base classifiers, as its components, and combines the results obtained by these algorithms to reach its final result. For an MCS to perform better than its base classifiers, the results obtained by the base classifiers must present a certain diversity, that is, a difference between the results obtained by each classifier composing the system; it makes no sense to have an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a continual search to improve the results obtained by this type of system. Aiming at this improvement, at more consistent results and at greater diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance that a given classifier had when assigning each pattern to a particular class, and they are used, together with the classifiers' outputs, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, which can be divided into two categories: static weights and dynamic weights. Weights in the first category do not have their values modified during the classification process, unlike those in the second category, whose values change during classification. In this work, an analysis is made to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs compared with individual systems. Moreover, an analysis of the diversity obtained by the MCSs is made, in order to verify whether there is some relation between the use of weights in MCSs and different levels of diversity.
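As an illustrative sketch of the static-weight combination idea described above (not the author's exact scheme), the snippet below combines three base classifiers by weighted soft voting, where each classifier's weight is its accuracy on a held-out validation set; a dynamic scheme would instead recompute the weights per test pattern, for example from local accuracy in the neighborhood of that pattern. Dataset and classifier choices here are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Diverse base classifiers trained on the same data.
bases = [DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()]
for clf in bases:
    clf.fit(X_train, y_train)

# Static weights: validation accuracy of each base classifier, normalized.
weights = np.array([clf.score(X_val, y_val) for clf in bases])
weights /= weights.sum()

# Weighted soft vote over the class-probability outputs.
proba = sum(w * clf.predict_proba(X_test) for w, clf in zip(weights, bases))
y_pred = proba.argmax(axis=1)
print("MCS accuracy:", (y_pred == y_test).mean())
```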

Relevância:

30.00%

Publicador:

Resumo:

Background: Biopharmaceutical drugs are mainly recombinant proteins produced by biotechnological tools. The patents of many biopharmaceuticals have expired, and biosimilars are thus currently being developed. Human granulocyte colony stimulating factor (hG-CSF) is a hematopoietic cytokine that acts on cells of the neutrophil lineage, causing proliferation and differentiation of committed precursor cells and activation of mature neutrophils. Recombinant hG-CSF has been produced in genetically engineered Escherichia coli (Filgrastim) and successfully used to treat cancer patients suffering from chemotherapy-induced neutropenia. Filgrastim is a 175 amino acid protein containing an extra N-terminal methionine, which is needed for expression in E. coli. Here we describe a simple and low-cost process, amenable to scaling up, for the production and purification of homogeneous and active recombinant hG-CSF expressed in E. coli cells. Results: We describe cloning of the human granulocyte colony-stimulating factor coding DNA sequence, protein expression in E. coli BL21(DE3) host cells in the absence of isopropyl-beta-D-thiogalactopyranoside (IPTG) induction, efficient isolation and solubilization of inclusion bodies by a multi-step washing procedure, and a purification protocol using a single cationic exchange column. Characterization of homogeneous rhG-CSF by size exclusion and reverse phase chromatography showed yields similar to the standard. Immunoassay and N-terminal sequencing confirmed the identity of rhG-CSF. The in vivo biological activity assay showed a biological effect equivalent (109.4%) to the standard reference rhG-CSF. The homogeneous rhG-CSF protein yield was 3.2 mg of bioactive protein per liter of cell culture. Conclusion: Recombinant protein expression in the absence of IPTG induction is advantageous since cost is reduced, and the protein purification protocol using a single chromatographic step should reduce cost even further for large-scale production. The physicochemical, immunological and biological analyses showed that this protocol can be useful for developing therapeutic bioproducts. In summary, the combination of different experimental strategies presented here allowed an efficient and cost-effective protocol for rhG-CSF production. These data may be of interest to biopharmaceutical companies interested in developing biosimilars and to the healthcare community.

Relevância:

30.00%

Publicador:

Resumo:

A significant part of film production in the coating industry is based on wet bench processes, where a better understanding of their temporal dynamics could facilitate control and optimization. In this work, in situ laser interferometry is applied to study properties of flowing liquids and to quantitatively monitor the dip coating batch process. Two oil standards, both Newtonian and non-volatile, with constant refractive indices and distinct flow properties, were measured at several withdrawal speeds. The physical thickness of the film then depends on time as t^(-1/2), and flow characterization becomes possible with high precision (linear slope uncertainty of ±0.04%). The resulting kinematic viscosities for OP60 and OP400 are 1.17 ± 0.03 St and 9.9 ± 0.2 St, respectively, in agreement with the nominal values provided by the manufacturer. For more complex films, a multi-component sol-gel zirconyl chloride aqueous solution with a varying refractive index, a direct polarimetric measurement also allows the temporal evolution of the physical thickness to be determined during dip coating (uncertainty of ±0.007 µm).
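The abstract reports that the film thickness follows a t^(-1/2) law and that the flow properties are obtained from the fitted slope; as a small sketch of that fitting step only (with simulated data and an arbitrary proportionality constant, since the paper's exact drainage model is not given here), the code below regresses measured thickness against t^(-1/2) and recovers the slope.

```python
import numpy as np

# Simulated thickness measurements following h(t) = a * t**-0.5 plus noise
# (a is an arbitrary constant for the example; units: micrometers, seconds).
rng = np.random.default_rng(1)
t = np.linspace(5.0, 120.0, 60)
a_true = 40.0
h = a_true * t ** -0.5 + rng.normal(scale=0.05, size=t.size)

# Linear least-squares fit of h against t**-0.5, assuming the drainage law
# passes through the origin (no intercept term).
x = t ** -0.5
slope = np.sum(x * h) / np.sum(x * x)
residual_std = np.std(h - slope * x)

print(f"fitted slope a = {slope:.2f} (true {a_true}), residual std = {residual_std:.3f}")
```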

Relevância:

30.00%

Publicador:

Resumo:

Factorial experiments are widely used in industry to investigate the effects of process factors on quality response variables. Many food processes, for example, are not only subject to variation between days, but also between different times of the day. Removing this variation using blocking factors leads to row-column designs. In this paper, an algorithm is described for constructing factorial row-column designs when the factors are quantitative, and the data are to be analysed by fitting a polynomial model. The row-column designs are constructed using an iterative interchange search, where interchanges that result in an improvement in the weighted mean of the efficiency factors corresponding to the parameters of interest are accepted. Some examples illustrating the performance of the algorithm are given.
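The abstract describes an iterative interchange search that accepts swaps improving a design criterion. As a simplified, hedged sketch of that idea (using a D-criterion for a quadratic polynomial model with row and column block effects as a stand-in for the paper's weighted mean of efficiency factors, and with arbitrary row, column and level choices), the code below improves a random starting row-column arrangement of factor levels by pairwise interchanges of cell contents.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
ROWS, COLS = 4, 6                                 # e.g. 4 days x 6 times of day
LEVELS = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])    # quantitative factor levels

def model_matrix(design):
    """Model matrix for a quadratic polynomial in x plus row and column blocks."""
    cols = [np.ones(design.size)]
    for i in range(1, ROWS):          # row block indicators (first row as baseline)
        cols.append((np.repeat(np.arange(ROWS), COLS) == i).astype(float))
    for j in range(1, COLS):          # column block indicators
        cols.append((np.tile(np.arange(COLS), ROWS) == j).astype(float))
    x = design.ravel()
    cols.extend([x, x ** 2])          # linear and quadratic terms of interest
    return np.column_stack(cols)

def d_criterion(design):
    """log-determinant of the information matrix; -inf if singular."""
    X = model_matrix(design)
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

# Random starting design: each row-column cell gets a factor level.
design = rng.choice(LEVELS, size=(ROWS, COLS))
best = d_criterion(design)

# Iterative interchange search: swap the contents of two cells and keep the
# swap whenever the criterion improves; repeat until no swap helps.
improved = True
while improved:
    improved = False
    cells = [(i, j) for i in range(ROWS) for j in range(COLS)]
    for (i1, j1), (i2, j2) in combinations(cells, 2):
        design[i1, j1], design[i2, j2] = design[i2, j2], design[i1, j1]
        score = d_criterion(design)
        if score > best + 1e-9:
            best, improved = score, True
        else:                          # revert a non-improving interchange
            design[i1, j1], design[i2, j2] = design[i2, j2], design[i1, j1]

print("final D-criterion:", round(best, 3))
print(design)
```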