1000 results for automatic particle picking
Abstract:
One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, electron cryo-microscopy and high-resolution single particle analysis have advanced to the point where they now provide a methodology for high-resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is assembling a large enough representative dataset ($>100,000$ particles). Traditionally, particles have been picked manually, an extremely labour-intensive process made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle picking software, which has been tested with both negatively stained and cryo-electron micrographs. The algorithm has been shown to select most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
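The claim of selecting most particles with few false positives is usually quantified by matching automatic picks against manual picks within a tolerance radius. A minimal evaluation sketch in Python (the tolerance, coordinates, and function name are illustrative assumptions, not from the paper):

```python
import numpy as np

def match_picks(auto_xy, manual_xy, tol=20.0):
    """Greedily match automatic picks to manual picks within `tol` pixels.

    Returns (true_positives, false_positives, false_negatives)."""
    auto_xy = np.asarray(auto_xy, dtype=float)
    manual_xy = np.asarray(manual_xy, dtype=float)
    unmatched = set(range(len(manual_xy)))
    tp = 0
    for a in auto_xy:
        if not unmatched:
            break
        idx = np.array(sorted(unmatched))
        d = np.linalg.norm(manual_xy[idx] - a, axis=1)
        j = int(np.argmin(d))
        if d[j] <= tol:                 # close enough to count as a hit
            unmatched.discard(int(idx[j]))
            tp += 1
    fp = len(auto_xy) - tp
    fn = len(unmatched)
    return tp, fp, fn

# Toy usage: precision/recall of a hypothetical picker on one micrograph.
tp, fp, fn = match_picks(auto_xy=[(10, 12), (100, 95)],
                         manual_xy=[(11, 11), (102, 97), (200, 200)])
precision = tp / (tp + fp)
recall = tp / (tp + fn)
```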
Abstract:
Single particle analysis (SPA) coupled with high-resolution electron cryo-microscopy is emerging as a powerful technique for the structure determination of membrane protein complexes and soluble macromolecular assemblies. Current estimates suggest that $\sim 10^4$–$10^5$ particle projections are required to attain a 3 Å resolution 3D reconstruction (symmetry dependent). Selecting this number of molecular projections differing in size, shape and symmetry is a rate-limiting step for the automation of 3D image reconstruction. Here, we present SwarmPS, a feature-rich, GUI-based software package to manage large-scale, semi-automated particle picking projects. The software provides cross-correlation and edge-detection algorithms. Algorithm-specific parameters are transparently and automatically determined through user interaction with the image, rather than by trial and error. Other features include multiple image handling ($\sim 10^2$), local and global particle selection options, interactive image freezing, automatic particle centering, and full manual override to correct false positives and negatives. SwarmPS is user friendly, flexible, extensible, fast, and capable of exporting boxed-out projection images, or particle coordinates, compatible with downstream image processing suites.
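A minimal sketch of the cross-correlation picking that such packages build on, using scikit-image (the template, threshold, and minimum peak separation below are assumptions, not SwarmPS parameters):

```python
import numpy as np
from skimage.feature import match_template, peak_local_max

def pick_by_correlation(micrograph, template, threshold=0.4, min_sep=15):
    """Return (row, col) candidate particle centres by normalized
    cross-correlation of a reference projection against the micrograph."""
    # With pad_input=True the correlation map has the micrograph's shape.
    cc = match_template(micrograph, template, pad_input=True)
    # Local maxima above the threshold, at least `min_sep` pixels apart.
    return peak_local_max(cc, min_distance=min_sep, threshold_abs=threshold)

# Toy example: a noisy image with two copies of a Gaussian-blob "particle".
rng = np.random.default_rng(0)
y, x = np.mgrid[-7:8, -7:8]
blob = np.exp(-(x**2 + y**2) / 18.0)
image = rng.normal(0.0, 0.2, size=(128, 128))
image[20:35, 20:35] += blob
image[80:95, 60:75] += blob
print(pick_by_correlation(image, blob))
```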
Abstract:
Site 1103 was one of a transect of three sites drilled across the Antarctic Peninsula continental shelf during Leg 178. The aim of drilling on the shelf was to determine the age of the sedimentary sequences and to ground truth previous interpretations of the depositional environment (i.e., topsets and foresets) of progradational seismostratigraphic sequences S1, S2, S3, and S4. The ultimate objective was to obtain a better understanding of the history of glacial advances and retreats on this west Antarctic margin. Drilling the topsets of the progradational wedge (0-247 m below seafloor [mbsf]), which consist of unsorted and unconsolidated materials of seismic Unit S1, was very unfavorable, resulting in very low (2.3%) core recovery. Recovery improved (34%) below 247 mbsf, corresponding to sediments of seismic Unit S3, which have a consolidated matrix. Logs were only obtained from the interval between 75 and 244 mbsf, and inconsistencies in the automatic analog picking of the signals received from the sonic log at the array and at the two other receivers prevented accurate shipboard time-depth conversions. This, in turn, limited the capacity for making seismic stratigraphic interpretations at this site and regionally. This study is an attempt to compile all available data sources, perform quality checks, and introduce nonstandard processing techniques for the logging data in order to arrive at a reliable and continuous depth vs. velocity profile. We defined 13 data categories using differential traveltime information. Polynomial exclusion techniques of various orders and low-pass filtering reduced the noise of the initial data pool and produced a definitive velocity-depth profile that is synchronous with the resistivity logging data. A comparison of the velocity profile with various other logs of Site 1103 further validates the presented data. All major logging units are expressed within the new velocity data. A depth-migrated section with the new velocity data is presented together with the original time section and the initial depth estimates published in the Leg 178 Initial Reports volume. The presented data confirm the location of the shelf unconformity at 222 ms two-way traveltime (TWT), or 243 mbsf, and allow its seismic identification as a strong negative and subsequent positive reflection.
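The nonstandard processing described (polynomial exclusion plus low-pass filtering) can be pictured as iteratively fitting a polynomial trend to the velocity-depth samples, discarding outliers, and smoothing what survives. A schematic Python sketch under assumed orders, thresholds, and filter settings (none of these values are from the Site 1103 study):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def clean_velocity_log(depth, velocity, order=3, k_sigma=2.5, n_iter=3):
    """Iteratively exclude samples that deviate from a polynomial trend."""
    keep = np.ones_like(velocity, dtype=bool)
    for _ in range(n_iter):
        coeffs = np.polyfit(depth[keep], velocity[keep], order)
        resid = velocity - np.polyval(coeffs, depth)
        keep = np.abs(resid) < k_sigma * resid[keep].std()
    return keep

def lowpass(signal_1d, cutoff=0.1):
    """Zero-phase Butterworth low-pass (cutoff as a fraction of Nyquist)."""
    b, a = butter(4, cutoff)
    return filtfilt(b, a, signal_1d)

# Toy usage: a noisy linear velocity trend with a few spurious picks.
z = np.linspace(75.0, 244.0, 300)     # depth (mbsf)
v = 1800.0 + 4.0 * (z - 75.0) + np.random.default_rng(1).normal(0, 40, z.size)
v[::50] += 600.0                      # simulated bad sonic picks
keep = clean_velocity_log(z, v)
v_smooth = lowpass(np.interp(z, z[keep], v[keep]))
```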
Abstract:
The 3D reconstruction of a Golgi-stained dendritic tree from a serial stack of images captured with a transmitted light bright-field microscope is investigated. Modifications to the bootstrap filter are discussed such that the tree structure may be estimated recursively as a series of connected segments. The tracking performance of the bootstrap particle filter is compared against Differential Evolution, an evolutionary global optimisation method, both in terms of robustness and accuracy. It is found that the particle filtering approach is significantly more robust and accurate for the data considered.
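For readers unfamiliar with the bootstrap filter, its core loop is propagate, weight, resample. A generic one-dimensional sketch (the dendrite-tracking state and likelihood in the paper are far richer; everything here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(observations, n_particles=500, proc_std=1.0, obs_std=2.0):
    """Minimal bootstrap particle filter for a 1D random-walk state."""
    particles = rng.normal(0.0, 5.0, n_particles)   # samples from the prior
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the (random-walk) motion model.
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # 2. Weight particles by the Gaussian observation likelihood.
        w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))  # posterior mean
        # 3. Resample proportionally to the weights (the "bootstrap" step).
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates

# Track a slowly drifting signal observed in heavy noise.
truth = np.cumsum(rng.normal(0.0, 1.0, 50))
obs = truth + rng.normal(0.0, 2.0, 50)
print(bootstrap_filter(obs)[-5:])
```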
Abstract:
Freeway systems are becoming more congested each day. One contributor to freeway traffic congestion is platoons of on-ramp traffic merging into freeway mainlines. As a relatively low-cost countermeasure to the problem, ramp meters are being deployed in both directions of an 11-mile section of I-95 in Miami-Dade County, Florida. The local Fuzzy Logic (FL) ramp metering algorithm implemented in Seattle, Washington, has been selected for deployment. The FL ramp metering algorithm is powered by the Fuzzy Logic Controller (FLC). The FLC depends on a series of parameters that can significantly alter the behavior of the controller, thus affecting the performance of ramp meters. However, the most suitable values for these parameters are often difficult to determine, as they vary with current traffic conditions; for optimum performance, the parameter values must be fine-tuned. This research presents a new method of fine-tuning several important FLC parameters using Particle Swarm Optimization (PSO). The objective function of the optimization model incorporates the METANET macroscopic traffic flow model to minimize delay time, subject to the constraints of reasonable ranges of ramp metering rates and FLC parameters. To further improve performance, a short-term traffic forecasting module using a discrete Kalman filter was incorporated to predict downstream freeway mainline occupancy, which helps to detect the presence of downstream bottlenecks. The CORSIM microscopic simulation model was selected as the platform to evaluate the performance of the proposed PSO tuning strategy. The ramp-metering algorithm incorporating the tuning strategy was implemented using CORSIM's run-time extension (RTE) and was tested on the aforementioned I-95 corridor. The performance of the FLC with PSO tuning was compared with that of the existing FLC without PSO tuning. The results show that the FLC with PSO tuning outperforms the existing FL metering, fixed-time metering, and existing conditions without metering in terms of total travel time savings, average speed, and system-wide throughput.
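The PSO loop at the heart of the tuning strategy is compact. A generic Python sketch with a stand-in objective (the paper's actual objective is the METANET-based delay, and the bounds would be the admissible FLC parameter ranges):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(objective, bounds, n_particles=30, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over box constraints."""
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))     # positions
    v = np.zeros((n_particles, dim))                # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_f)]                   # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # Classic velocity update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                  # enforce parameter bounds
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)]
    return g, pbest_f.min()

# Stand-in objective: any delay-like scalar function of four FLC parameters.
bounds = np.array([[0.0, 1.0]] * 4)
best, best_f = pso_minimize(lambda p: np.sum((p - 0.3) ** 2), bounds)
```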
Abstract:
This paper introduces a new method to automate the detection of marine species in aerial imagery using a Machine Learning approach. Our proposed system has, at its core, a convolutional neural network. We compare this trainable classifier to a handcrafted classifier based on color features, entropy and shape analysis. Experiments demonstrate that the convolutional neural network outperforms the handcrafted solution. We also introduce a negative training-example selection method for situations where the original training set consists of a collection of labeled images in which the objects of interest (positive examples) have been marked by a bounding box. We show that picking random rectangles from the background is not necessarily the best way to generate useful negative examples with respect to learning.
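The negative-example problem can be made concrete: naive random background crops may still overlap objects of interest. A sketch of overlap-aware sampling (the IoU threshold, crop size, and function names are illustrative, not the paper's actual selection method):

```python
import numpy as np

rng = np.random.default_rng(0)

def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x1 - x0) * max(0, y1 - y0)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def sample_negatives(img_w, img_h, positives, n, size=64, max_iou=0.1):
    """Draw background rectangles that barely overlap any positive box."""
    out = []
    while len(out) < n:
        x = int(rng.integers(0, img_w - size))
        y = int(rng.integers(0, img_h - size))
        box = (x, y, x + size, y + size)
        if all(iou(box, p) <= max_iou for p in positives):
            out.append(box)
    return out

negs = sample_negatives(1024, 768, positives=[(100, 100, 180, 170)], n=5)
```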
Abstract:
Ultra-fine particles of Ni-B amorphous alloy were prepared by chemical reduction of Ni2+ with NaBH4 and characterized with TEM and XRD. The heat capacity and thermal stability were measured with a high-precision automatic adiabatic calorimeter and DTA. The upper temperature limit for use of the substance as a catalyst was found to be 684 K. (C) 1999 Elsevier Science B.V. All rights reserved.
Abstract:
Feature analysis is an important task that can significantly affect the performance of automatic bacteria colony picking, and unstructured environments further complicate automatic colony screening. This paper presents a novel approach for adaptive colony segmentation in unstructured environments that treats the detected peaks of intensity histograms as a morphological feature of images. To avoid spurious peaks, an entropy-based mean shift filter is introduced to smooth images as a preprocessing step. The relevance and importance of these features are determined in an improved support vector machine classifier using unascertained least squares estimation. Experimental results show that the proposed unascertained least squares support vector machine (ULSSVM) achieves better recognition accuracy than the other state-of-the-art techniques, and its training takes less time than most of the traditional approaches considered in this paper.
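The histogram-peak feature can be sketched directly: smooth the image, build the intensity histogram, and detect its peaks. In this illustrative Python sketch a plain Gaussian blur stands in for the paper's entropy-based mean shift filter, and all thresholds are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import find_peaks

def histogram_peaks(image, bins=256, min_prominence=0.5):
    """Peaks of the intensity histogram of a smoothed image."""
    smoothed = gaussian_filter(image.astype(float), sigma=2.0)
    hist, edges = np.histogram(smoothed, bins=bins, density=True)
    peaks, _ = find_peaks(hist, prominence=min_prominence)
    return edges[peaks]           # intensity values at the histogram peaks

# Toy bimodal image: dark background plus a brighter "colony" patch.
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.03, (200, 200))
img[60:90, 60:90] = rng.normal(0.7, 0.03, (30, 30))
print(histogram_peaks(img))
```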
Abstract:
Forest fires are an important source of emissions of gaseous compounds and aerosols. In Portugal, where most fires occur in the north and centre of the country, fires destroy thousands of hectares every year, with major losses in terms of economics, human lives and environmental quality. The emissions can considerably alter atmospheric chemistry, degrade air quality and change the climate. However, information on the characteristics of forest fire emissions in Mediterranean countries is limited. Both nationally and internationally, there is growing interest in compiling emission inventories and in regulating carbon emissions to the atmosphere. From the standpoint of atmospheric monitoring, fires are considered a challenge, given their temporal and spatial variability, the expected increase in their frequency, size and severity, and the fact that emission estimates depend on the characteristics of the biofuels and on the combustion phase. The objective of this study was to quantify and characterize the emissions of gases and aerosols from some of the most representative forest fires that occurred in central Portugal in the summers of 2009 and 2010. Samples of gases and of two particle fractions (PM2.5 and PM2.5-10) were collected from the smoke plumes in Tedlar bags and on quartz filters coupled to a high-volume sampler, respectively. Total hydrocarbons (THC) and carbon oxides (CO and CO2) in the gaseous samples were analysed with automatic flame ionization instruments and non-dispersive infrared detectors, respectively. For some samples, carbonyl compounds were also quantified after resampling the gas from the Tedlar bags onto silica gel cartridges coated with 2,4-dinitrophenylhydrazine (DNPH), followed by analysis by high-performance liquid chromatography. In the particles, organic and elemental carbon (thermo-optical technique), water-soluble ions (ion chromatography) and elements (inductively coupled plasma mass spectrometry or instrumental neutron activation analysis) were analysed. Organic speciation was obtained by gas chromatography coupled to mass spectrometry after extraction with various solvents and separation of the organic extracts into several classes of different polarities by silica gel fractionation. The emission factors of CO and CO2 were in the ranges 52-482 and 822-1690 g kg-1 (dry basis), showing, respectively, negative and positive correlations with combustion efficiency. The THC emission factors were higher during the smouldering phase, ranging from 0.33 to 334 g kg-1 (dry basis). The most abundant oxygenated volatile organic compound was acetaldehyde, with emission factors ranging from 1.0 to 3.2 g kg-1 (dry basis), followed by formaldehyde and propionaldehyde. The emissions of these compounds were observed to be promoted during the smouldering phase. The PM2.5 and PM10 emission factors registered values between 0.50-68 and 0.86-72 g kg-1 (dry basis), respectively. The emission of fine and coarse particles is likewise promoted under slow combustion conditions. PM2.5 represented about 90% of the PM10 particle mass. The carbonaceous fraction of the particles sampled in all of the fires was clearly dominated by organic carbon.
A wide range of organic carbon to elemental carbon ratios was obtained, depending on combustion conditions. However, all ratios reflected a higher proportion of organic carbon relative to elemental carbon, typical of biomass burning emissions. The water-soluble ions in the smoke plume particles contributed up to 3.9% of the PM2.5 mass and 2.8% of the PM2.5-10 mass. Potassium contributed up to 15 µg mg-1 of PM2.5 and 22 µg mg-1 of PM2.5-10, although in absolute mass it was mostly present in the fine particles. The potassium to elemental carbon and potassium to organic carbon ratios obtained in the smoke plume particles fall within the range of values reported in the literature for biomass burning emissions. The elements detected in the samples represented, on average, up to 1.2% and 12% of the PM2.5 and PM2.5-10 mass, respectively. Particles resulting from more complete combustion (high CO2 and low CO values) were characterized by a high content of inorganic constituents and a lower content of organic matter. The particulate organic matter was observed to be composed mainly of phenolic components and derived products, homologous compound series (alkanes, alkenes, alkanoic acids and alkanols), sugars, steroid and terpenoid biomarkers, and polycyclic aromatic hydrocarbons. Retene, a biomarker of conifer burning emissions, was the dominant aromatic hydrocarbon in the smoke plume samples collected during the 2009 campaign, owing to the predominance of samples taken from fires in pine forests. The main anhydrosugar, and always one of the most abundant compounds, was levoglucosan. The levoglucosan/OC ratios obtained in the smoke plume particles registered, on average, values from 5.8 to 23 mg g-1 OC. The levoglucosan/mannosan and levoglucosan/(mannosan+galactosan) ratios revealed the predominance of samples from conifer burning. Given that the estimation of forest fire emissions requires knowledge of emission factors appropriate to each biofuel, the comprehensive database obtained in this study is potentially useful for updating emission inventories. It has been observed that the smouldering phase, which can occur simultaneously with the flaming phase and last for several hours or days, can contribute a considerable amount of atmospheric pollutants, so the corresponding emission factors should be considered in the calculation of global forest fire emissions. Owing to the lack of detailed information on chemical emission profiles, the database obtained in this study may also be useful for the application of receptor models in southern Europe.
Abstract:
This paper describes a method of identifying morphological attributes that classify wear particles in relation to the wear process from which they originate, permitting automatic identification without human expertise. The method is based on the use of a Multi-Layer Perceptron (MLP) for the analysis of specific types of microscopic wear particles. The classification of the wear particles was performed according to their morphological attributes, including size and aspect ratio. (C) 2010 Journal of Mechanical Engineering. All rights reserved.
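A minimal sketch of such a classification stage using scikit-learn (the synthetic features, class labels, and network size are illustrative; the paper trains on morphological attributes such as size and aspect ratio):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic morphological features: [equivalent diameter (um), aspect ratio].
cutting = np.column_stack([rng.normal(40, 8, 100), rng.normal(4.0, 0.8, 100)])
fatigue = np.column_stack([rng.normal(15, 4, 100), rng.normal(1.3, 0.2, 100)])
X = np.vstack([cutting, fatigue])
y = np.array(["cutting"] * 100 + ["fatigue"] * 100)

# Small MLP mapping morphological attributes to a wear-process class.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[35.0, 3.5], [14.0, 1.2]]))   # expected: cutting, fatigue
```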
Abstract:
This work studies a new reflection tomographic inversion method for determining a smooth, isotropic velocity model through the application, to synthetic and real data, of the Niptomo program, an implementation of the tomographic inversion method based on the kinematic attributes of the hypothetical normal-incidence-point (NIP) wave. The input data for the tomographic inversion, i.e., the traveltime and the NIP-wave attributes (radius of curvature of the emerging wavefront and emergence angle), are taken from a series of points chosen in the simulated zero-offset (ZO) section obtained by the common-reflection-surface (CRS) stacking method. Normally, these points are chosen in the ZO section using automatic picking programs, which identify locally coherent events in the seismic section based on user-supplied parameters. Picking is one of the most critical steps in tomographic inversion methods, because data from events that are not primary reflections may be included in the process, degrading the velocity model obtained by the inversion. The objective of this work is to build an interactive picking program that gives the user control over the choice of primary-reflection points whose data will be used in the tomographic inversion. The picking and tomographic inversion processes are applied to the Marmousi synthetic data and to data from seismic line 50-RL-90 of the Tacutu Basin. The results show that interactive picking of points on primary-reflection events favours obtaining a more accurate velocity model.
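The notion of "locally coherent events" behind automatic picking can be illustrated with a simple semblance measure computed in sliding windows; picks are proposed where coherence exceeds a threshold. A toy Python sketch (the window sizes, flat-event assumption, and threshold are illustrative, not Niptomo's parameters):

```python
import numpy as np

def semblance_picks(section, t_win=11, x_win=7, threshold=0.8):
    """Propose (sample, trace) picks where local flat-event semblance is high.

    `section` is a 2D array (time samples x traces)."""
    nt, nx = section.shape
    ht, hx = t_win // 2, x_win // 2
    picks = []
    for it in range(ht, nt - ht):
        for ix in range(hx, nx - hx):
            w = section[it - ht:it + ht + 1, ix - hx:ix + hx + 1]
            stack = w.sum(axis=1)                  # zero-dip slant stack
            num = float((stack ** 2).sum())
            den = float(x_win * (w ** 2).sum())    # classic semblance norm
            if den > 0 and num / den > threshold:
                picks.append((it, ix))
    return picks

# Toy section: one flat reflection buried in random noise.
rng = np.random.default_rng(0)
sec = 0.1 * rng.normal(size=(120, 40))
sec[60, :] += 1.0
print(semblance_picks(sec)[:5])
```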
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; thereby an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (interconnected conditions that occur in an activity) and individualization (characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model; the latter explains that search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed, because it is now expressed through the generation of the query and its own context. In fact, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the aim of optimizing any IR system led to its development as a middleware of interaction between the user and the IR system. The system thus has just two possible actions: rewriting the query and reordering the results. These actions are equivalent to those described for PS, which generally exploits information derived from the analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis goes further, generating a novel assessment procedure, in line with the "Cranfield paradigm", to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness attained and the innovative approach undertaken, together with the several applications it inspired using a local knowledge base.
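The two middleware actions (query rewriting from a local knowledge base, result reordering) can be sketched in a few lines of Python; the knowledge-base format and the scoring below are assumptions for illustration, not the thesis's actual implementation:

```python
def rewrite_query(query, knowledge_base):
    """Expand the query with context terms drawn from a local knowledge base."""
    terms = query.lower().split()
    expansion = set()
    for term in terms:
        expansion.update(knowledge_base.get(term, []))
    return " ".join(terms + sorted(expansion))

def reorder(results, context_terms):
    """Rerank results by overlap between each snippet and the user context."""
    score = lambda r: sum(t in r.lower() for t in context_terms)
    return sorted(results, key=score, reverse=True)

# Toy usage with a one-entry knowledge base.
kb = {"picking": ["particle", "cryo-em"]}
q = rewrite_query("automatic picking", kb)
docs = reorder(["a cryo-em picking tool", "apple picking season"],
               context_terms=["cryo-em", "particle"])
```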