959 results for CMF, molecular cloud, extraction algorithm


Relevance: 30.00%

Abstract:

The present work is part of the PRANA project, the first extensive field campaign of observation of atmospheric emission spectra covering the far-infrared (FIR) spectral region over more than two years. The principal deployed instrument is REFIR-PAD, a Fourier transform spectrometer that we use to study Antarctic cloud properties. A dataset covering the whole of 2013 has been analyzed. First, a selection of good-quality spectra is performed, using radiance values in a few chosen spectral regions as thresholds. These spectra are described in a synthetic way by averaging radiances over selected intervals, converting the averages into brightness temperatures (BTs) and finally considering the differences between each pair of them. A supervised feature selection algorithm is implemented to select the features that are truly informative about the presence, phase and type of cloud. Training and test sets are then collected with the help of lidar quick-looks. The supervised classification of the overall monthly datasets is performed with a support vector machine (SVM). On the basis of this classification, and with the help of the lidar observations, 29 non-precipitating ice cloud case studies are selected. A single spectrum, or at most an average over two or three spectra, is processed with the retrieval algorithm RT-RET, exploiting the main IR window channels, in order to extract cloud properties. The retrieved effective radii and optical depths are analyzed, compared with literature studies, and examined for possible seasonal trends. Finally, the retrieved atmospheric profiles are used as inputs for simulations assuming two different crystal habits, with the aim of examining our ability to reproduce radiances in the FIR. Substantial mis-estimations are found for the FIR micro-windows: a high variability is observed in the spectral pattern of the deviations of the simulations from the measured spectra, and an effort has been made to link these deviations to cloud parameters.
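
As a rough illustration of the classification step described above, the sketch below builds pairwise brightness-temperature differences as features and trains an SVM on them; the array shapes, class labels and values are invented placeholders, not the PRANA/REFIR-PAD processing code.

```python
# Minimal sketch (not the thesis code): pairwise brightness-temperature
# differences as features for a supervised cloud-scene SVM classifier.
# All names, labels and the feature layout are illustrative assumptions.
from itertools import combinations

import numpy as np
from sklearn.svm import SVC

def pairwise_bt_differences(bt):
    """bt: (n_spectra, n_intervals) mean brightness temperatures.
    Returns all pairwise differences BT_i - BT_j as features."""
    pairs = list(combinations(range(bt.shape[1]), 2))
    return np.stack([bt[:, i] - bt[:, j] for i, j in pairs], axis=1)

# Hypothetical training data: BT averages and lidar-derived labels
# (e.g. 0 = clear, 1 = ice cloud, 2 = mixed/liquid cloud).
bt_train = np.random.rand(200, 6) * 40 + 200      # stand-in for real data
labels = np.random.randint(0, 3, size=200)

clf = SVC(kernel="rbf", C=1.0)
clf.fit(pairwise_bt_differences(bt_train), labels)

bt_new = np.random.rand(10, 6) * 40 + 200
print(clf.predict(pairwise_bt_differences(bt_new)))
```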

Relevance: 30.00%

Abstract:

This thesis addresses the development and improvement of linear-scaling algorithms for electronic-structure-based molecular dynamics. Molecular dynamics is a method for the computer simulation of the complex interplay between atoms and molecules at finite temperature. A decisive advantage of this method is its high accuracy and predictive power. However, the computational cost, which in principle scales cubically with the number of atoms, prevents its application to large systems and long time scales. Starting from a new formalism based on the grand-canonical potential and a factorization of the density matrix, the diagonalization of the corresponding Hamiltonian matrix is avoided. The approach exploits the fact that the Hamiltonian and density matrices are sparse due to localization. This reduces the computational cost so that it scales linearly with system size. To demonstrate its efficiency, the resulting algorithm is applied to a system of liquid methane subjected to extreme pressure (about 100 GPa) and extreme temperature (2000 - 8000 K). In the simulation, methane dissociates at temperatures above 4000 K, and the formation of sp²-bonded polymeric carbon is observed. The simulations give no indication of diamond formation and therefore have implications for existing planetary models of Neptune and Uranus. Since avoiding the diagonalization of the Hamiltonian matrix entails the inversion of matrices, the problem of computing an (inverse) p-th root of a given matrix is additionally addressed. This results in a new formula for symmetric positive-definite matrices. It generalizes the Newton-Schulz iteration, Altman's formula for bounded, non-singular operators, and Newton's method for finding roots of functions. It is proved that the order of convergence is always at least quadratic and that adaptive adjustment of a parameter q leads to better results in all cases.
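
The abstract does not give the new formula itself, so the following sketch only illustrates the kind of multiplication-only, diagonalization-free scheme it generalizes: the classical Newton-type iteration for the inverse p-th root of a symmetric positive-definite matrix.

```python
# Illustrative only: a standard Newton-type iteration for the inverse
# p-th root of a symmetric positive-definite matrix A, i.e. X -> A^(-1/p),
# using matrix multiplications only (no diagonalization). This is NOT the
# thesis's new formula, just the classical scheme it generalizes.
import numpy as np

def inverse_pth_root(A, p, iters=50, tol=1e-12):
    n = A.shape[0]
    I = np.eye(n)
    # Scale the initial guess X0 = c*I so every eigenmode starts below
    # its target value, which guarantees monotone convergence for SPD A.
    c = 1.0 / np.linalg.norm(A, 2) ** (1.0 / p)
    X = c * I
    for _ in range(iters):
        Xp = np.linalg.matrix_power(X, p)
        X_new = X @ ((p + 1) * I - Xp @ A) / p
        if np.linalg.norm(X_new - X, 'fro') < tol:
            return X_new
        X = X_new
    return X

# Quick check on a random SPD matrix: X^p @ A should be close to I.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
X = inverse_pth_root(A, p=2)
print(np.linalg.norm(np.linalg.matrix_power(X, 2) @ A - np.eye(5)))
```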

Relevance: 30.00%

Abstract:

This thesis aims to assess similarities and mismatches between the outputs of two independent methods for cloud cover quantification and classification that rest on quite different physical bases. One is the SAFNWC software package, designed to process radiance data acquired by the SEVIRI sensor in the VIS/IR. The other is the MWCC algorithm, which uses the brightness temperatures acquired by the AMSU-B and MHS sensors in their channels centered in the MW water vapour absorption band. At a first stage their cloud detection capability has been tested by comparing the cloud masks they produce. These showed good agreement between the two methods, although some critical situations stand out: the MWCC fails to reveal clouds which, according to SAFNWC, are fractional, cirrus, very low or high opaque clouds. In the second stage of the inter-comparison, the pixels classified as cloudy by both tools have been compared in terms of the assigned cloud classes. The overall tendency of the MWCC method is an overestimation of the lower cloud classes. Conversely, the higher the cloud top, the more often the MWCC misses cloud portions that are detected by the SAFNWC tool. This also emerges from a series of tests carried out using the cloud top height information to evaluate the height ranges in which each MWCC category is defined. Therefore, although the two methods intend to provide the same kind of information, in reality they return quite different details on the same atmospheric column. The SAFNWC retrieval, being very sensitive to the cloud-top temperature, captures the actual level reached by the cloud top. The MWCC, by exploiting the penetration capability of the microwaves, is able to give information about the levels located more deeply within the atmospheric column.
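
A minimal sketch of the kind of pixel-wise cross-tabulation underlying such an inter-comparison is shown below; the class codes and arrays are placeholders, not the actual SAFNWC or MWCC category definitions.

```python
# Hypothetical sketch of a pixel-wise cross-tabulation between two
# collocated cloud classifications; the class codes are invented, not
# the real SAFNWC / MWCC category numbering.
import numpy as np

def agreement_table(mask_a, mask_b, n_classes_a, n_classes_b):
    """Counts, for each (class_a, class_b) pair, how many collocated
    pixels were assigned class_a by one method and class_b by the other."""
    table = np.zeros((n_classes_a, n_classes_b), dtype=int)
    np.add.at(table, (mask_a.ravel(), mask_b.ravel()), 1)
    return table

safnwc = np.random.randint(0, 4, size=(100, 100))   # e.g. 0=clear ... 3=high opaque
mwcc = np.random.randint(0, 3, size=(100, 100))     # e.g. 0=clear ... 2=thick cloud
print(agreement_table(safnwc, mwcc, 4, 3))
print("cloudy/clear agreement:", ((safnwc > 0) == (mwcc > 0)).mean())
```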

Relevance: 30.00%

Abstract:

Surface-based measurement systems play a key role in defining the ground truth for climate modeling and satellite product validation. The Italian-French station Concordia has been operating year-round since 2005 at Dome C (75°S, 123°E, 3230 m) on the East Antarctic Plateau. A Baseline Surface Radiation Network (BSRN) site was deployed and became operational in January 2006, measuring the downwelling components of the radiation budget, and was subsequently expanded in April 2007 to measure upwelling radiation. Hence, almost a decade of measurements is now available, suitable for defining a statistically significant climatology of the radiation budget at Concordia, including possible trends, by specifically assessing the effects of clouds and water vapor on the SW and LW net radiation. A well-known and robust clear-sky identification algorithm (Long and Ackerman, 2000) has been operationally applied to the downwelling SW components to identify cloud-free events and to fit a parametric equation determining the clear-sky reference during the Antarctic daylight period (September to April). A new model for surface broadband albedo has been developed in order to better describe the features of the area. Then a novel clear-sky LW parametrization, based on a-priori assumptions about the inversion-layer structure combined with the daily and annual oscillations of the surface temperature, has been adopted and validated. The longwave-based method is then exploited to extend the cloud radiative forcing (CRF) studies to the nighttime period (winter). Results indicate an inter-annual and intra-annual warming behaviour, 13.70 W/m² on average, approaching a neutral effect in summer, when the SW CRF compensates the LW CRF, and a warming during the rest of the year due mainly to the CRF induced on the LW component.
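
For reference, the sketch below illustrates the usual definition of cloud radiative forcing as the difference between measured and clear-sky net fluxes, split into SW and LW components; the flux values and variable names are placeholders, not the Concordia processing chain.

```python
# Schematic illustration (not the station's processing code) of the usual
# cloud radiative forcing definition: CRF = (net all-sky flux) minus
# (net clear-sky flux), for the SW and LW components separately.
import numpy as np

def cloud_radiative_forcing(sw_down, sw_up, lw_down, lw_up,
                            sw_down_clear, sw_up_clear,
                            lw_down_clear, lw_up_clear):
    """All inputs are time series of broadband fluxes in W/m^2."""
    crf_sw = (sw_down - sw_up) - (sw_down_clear - sw_up_clear)
    crf_lw = (lw_down - lw_up) - (lw_down_clear - lw_up_clear)
    return crf_sw, crf_lw, crf_sw + crf_lw

# Example with made-up numbers: SW cooling of 30 W/m^2 offset by
# LW warming of 30 W/m^2, i.e. a roughly neutral net effect.
crf_sw, crf_lw, crf_net = cloud_radiative_forcing(
    sw_down=np.array([400.0]), sw_up=np.array([320.0]),
    lw_down=np.array([190.0]), lw_up=np.array([200.0]),
    sw_down_clear=np.array([450.0]), sw_up_clear=np.array([340.0]),
    lw_down_clear=np.array([155.0]), lw_up_clear=np.array([195.0]),
)
print(crf_sw, crf_lw, crf_net)
```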

Relevance: 30.00%

Abstract:

In most pathology laboratories worldwide, formalin-fixed paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most of the current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting from the extraction of nucleic acids. To evaluate the current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories participating in the European FP6 program IMPACTS (www.impactsnetwork.eu) isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from homemade protocols to commercial kits. Except for one homemade protocol, the methods gave comparable results in terms of the quality of the extracted DNA, as measured by the ability to amplify control gene fragments of different sizes by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extractions, the best results were obtained using chromatography-column-based commercial kits, which yielded the highest quantity and the most assayable RNA. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modifications of the proteinase K digestion time led to better results, even when commercial kits were applied. The results of the study emphasize the need for quality control of the nucleic acid extracts with standardised methods to prevent false negative results and to allow data comparison among different diagnostic laboratories.

Relevance: 30.00%

Abstract:

Automatic identification and extraction of bone contours from X-ray images is an essential first step for further medical image analysis. In this paper we propose a 3D statistical-model-based framework for proximal femur contour extraction from calibrated X-ray images. The automatic initialization is solved by an Estimation of Bayesian Network Algorithm that fits a multiple-component geometrical model to the X-ray data. The contour extraction is accomplished by a non-rigid 2D/3D registration between a 3D statistical model and the X-ray images, in which bone contours are extracted by graphical-model-based Bayesian inference. Preliminary experiments on clinical data sets verified its validity.

Relevance: 30.00%

Abstract:

Satellite measurement validation, climate models, atmospheric radiative transfer models and cloud models all depend on accurate measurements of cloud particle size distributions, number densities, spatial distributions, and other parameters relevant to cloud microphysical processes. Yet many airborne instruments designed to measure size distributions and concentrations of cloud particles have large uncertainties in measuring number densities and size distributions of small ice crystals. HOLODEC (Holographic Detector for Clouds) is a new instrument that does not suffer from many of these uncertainties and makes measurements possible that other probes have never made. The advantages of HOLODEC are inherent to the holographic method. In this dissertation, I describe HOLODEC, its in-situ measurements of cloud particles, and the results of its test flights. I present a hologram reconstruction algorithm whose sample spacing does not vary with reconstruction distance. This algorithm accurately reconstructs the field at all distances inside a typical holographic measurement volume, as proven by comparison with analytical solutions of the Huygens-Fresnel diffraction integral. It is fast to compute and has diffraction-limited resolution. Further, I describe an algorithm that can find the position along the optical axis of small particles as well as of large, complex-shaped particles. I explain an implementation of these algorithms as an efficient, robust, automated program that allows us to process holograms on a computer cluster in a reasonable time. I show size distributions and number densities of cloud particles and show that they are within the uncertainty of independent measurements made with another method. The feasibility of a cloud particle instrument with advantages over standard instruments is thus proven. These advantages include a unique ability to detect shattered particles using three-dimensional positions, and a sample volume size that does not vary with particle size or airspeed. It is also able to yield two-dimensional particle profiles from the same measurements.
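
The angular-spectrum (plane-wave transfer function) method is a standard reconstruction whose sample spacing equals the detector pixel pitch at every distance, which is the property highlighted above; the sketch below illustrates that class of algorithm and is not the HOLODEC implementation.

```python
# Illustrative angular-spectrum propagation in NumPy. This class of
# reconstruction keeps the output sample spacing equal to the detector
# pixel pitch at every reconstruction distance (unlike single-FFT Fresnel
# propagation). It shows the idea only, not the HOLODEC code itself.
import numpy as np

def angular_spectrum_propagate(hologram, wavelength, pixel_pitch, z):
    """Reconstruct the complex field at distance z from a recorded
    hologram (2D real or complex array). Units: meters."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Propagating-wave transfer function; evanescent components are zeroed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2j * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# Example: reconstruct a synthetic hologram at 50 mm (made-up parameters).
holo = np.random.rand(512, 512)                       # stand-in for real data
field = angular_spectrum_propagate(holo, wavelength=532e-9,
                                   pixel_pitch=7.4e-6, z=0.05)
amplitude = np.abs(field)                             # used for particle detection
```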

Relevance: 30.00%

Abstract:

In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm is proposed, based on color information and thresholding. For the proposed algorithm, several color spaces are studied and the detection results are compared; experimental results show that the YUV color space achieves the best performance. In addition, I develop a distance-histogram-based threshold selection method and show that it outperforms other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I also investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for accelerating skin lesion extraction. Based on the proposed skin lesion detection algorithms, I develop a mobile skin cancer diagnosis application. With this application installed, a user can employ an iPhone as a diagnosis tool to find potential skin lesions on a person's skin and compare the lesions detected by the iPhone with the skin lesions stored in a database on a remote server.
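
A minimal sketch of the color-space idea follows: convert RGB to YUV and threshold the chroma distance from a reference skin color. The reference color and the fixed threshold are placeholders; the thesis instead derives the threshold from a distance histogram.

```python
# Simplified sketch of color-space based lesion detection: convert RGB to
# YUV and threshold the chroma distance from a reference skin color. The
# reference color and threshold are placeholders, not the thesis's values.
import numpy as np

def rgb_to_yuv(img):
    """img: float array in [0, 1], shape (H, W, 3). BT.601-style matrix."""
    m = np.array([[0.299, 0.587, 0.114],
                  [-0.147, -0.289, 0.436],
                  [0.615, -0.515, -0.100]])
    return img @ m.T

def lesion_mask(img_rgb, ref_uv=(0.02, 0.10), threshold=0.08):
    yuv = rgb_to_yuv(img_rgb)
    du = yuv[..., 1] - ref_uv[0]
    dv = yuv[..., 2] - ref_uv[1]
    dist = np.sqrt(du ** 2 + dv ** 2)     # chroma distance to "normal skin"
    return dist > threshold               # True where a lesion is suspected

img = np.random.rand(64, 64, 3)           # stand-in for a skin photograph
mask = lesion_mask(img)
print(mask.mean(), "of pixels flagged")
```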

Relevance: 30.00%

Abstract:

In recent years, advanced metering infrastructure (AMI) has been a main research focus because the traditional power grid has become too restricted to meet development requirements. There has been an ongoing effort to increase the number of AMI devices that provide real-time data readings to improve system observability. AMI deployed across distribution secondary networks provides load and consumption information for individual households, which can improve grid management. The significant upgrade costs associated with retrofitting existing meters with network-capable sensing can be made more economical by using image processing methods to extract usage information from images of the existing meters. This thesis presents a new solution that uses online exchange of power consumption information with a cloud server without modifying the existing electromechanical analog meters. In this framework, a systematic approach to extracting energy data from images replaces the manual reading process. In one case study, the digital imaging approach is compared to averages determined by visual readings over a one-month period.

Relevance: 30.00%

Abstract:

Molecular diagnosis of canine bartonellosis can be extremely challenging and often requires the use of an enrichment culture approach followed by PCR amplification of bacterial DNA. HYPOTHESES: (1) The use of enrichment culture with PCR will increase molecular detection of bacteremia and will expand the diversity of Bartonella species detected. (2) Serological testing for Bartonella henselae and Bartonella vinsonii subsp. berkhoffii does not correlate with documentation of bacteremia. ANIMALS: Between 2003 and 2009, 924 samples from 663 dogs were submitted to the North Carolina State University, College of Veterinary Medicine, Vector Borne Diseases Diagnostic Laboratory for diagnostic testing with the Bartonella α-Proteobacteria growth medium (BAPGM) platform. Test results and medical records of those dogs were retrospectively reviewed. METHODS: PCR amplification of Bartonella sp. DNA after extraction from patient samples was compared with PCR after BAPGM enrichment culture. Indirect immunofluorescent antibody assays, used to detect B. henselae and B. vinsonii subsp. berkhoffii antibodies, were compared with PCR. RESULTS: Sixty-one of 663 dogs were culture positive or had Bartonella DNA detected by PCR, including B. henselae (30/61), B. vinsonii subsp. berkhoffii (17/61), Bartonella koehlerae (7/61), Bartonella volans-like (2/61), and Bartonella bovis (2/61). Coinfection with more than 1 Bartonella sp. was documented in 9/61 dogs. BAPGM culture was required for PCR detection in 32/61 cases. Only 7/19 and 4/10 infected dogs tested by IFA were B. henselae and B. vinsonii subsp. berkhoffii seroreactive, respectively. CONCLUSIONS AND CLINICAL IMPORTANCE: Dogs were most often infected with B. henselae or B. vinsonii subsp. berkhoffii based on PCR and enrichment culture, coinfection was documented, and various Bartonella species were identified. Most infected dogs did not have detectable Bartonella antibodies.

Relevance: 30.00%

Abstract:

The two major subtypes of diffuse large B-cell lymphoma (DLBCL) (germinal centre B-cell-like (GCB-DLBCL) and activated B-cell-like (ABC-DLBCL)) are defined by means of gene expression profiling (GEP). Patients with GCB-DLBCL survive longer with the current standard regimen R-CHOP than patients with ABC-DLBCL. As GEP is not part of the current routine diagnostic work-up, efforts have been made to find a substitute that involves immunohistochemistry (IHC). Various algorithms achieved this with 80-90% accuracy. However, conflicting results on the appropriateness of IHC have been reported. Because it is likely that the molecular subtypes will play a role in future clinical practice, we assessed the determination of the molecular DLBCL subtypes by means of IHC at our University Hospital, and some aspects of this determination elsewhere in Switzerland. The most frequently used Hans algorithm includes three antibodies (against CD10, bcl-6 and MUM1). From records of the routine diagnostic work-up, we identified 51 of 172 (29.7%) newly diagnosed and treated DLBCL cases from 2005 until 2010 with an assigned DLBCL subtype. DLBCL subtype information was expanded by means of tissue microarray analysis. The outcome for patients with the GCB subtype was significantly better compared with those with the non-GC subtype, independent of the age-adjusted International Prognostic Index. We found a lack of standardisation in the subtype determination by means of IHC in Switzerland and significant problems of reproducibility. We conclude that the Hans algorithm performs well in our hands and that awareness of this important matter is increasing. However, outside clinical trials, vigorous efforts to standardise IHC determination are needed as DLBCL subtype-specific therapies emerge.
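
For orientation, the Hans classifier is commonly summarized as the following decision tree over the three stains, with a 30% positivity cut-off; the sketch is illustrative and local scoring protocols may differ.

```python
# The Hans classifier as commonly summarized (Hans et al., 2004): a
# decision tree over CD10, BCL6 and MUM1 immunostains, each scored
# positive when >= 30% of tumour cells stain. Illustration only.
def hans_subtype(cd10_pct, bcl6_pct, mum1_pct, cutoff=30.0):
    cd10 = cd10_pct >= cutoff
    bcl6 = bcl6_pct >= cutoff
    mum1 = mum1_pct >= cutoff
    if cd10:
        return "GCB"
    if not bcl6:
        return "non-GC"
    return "GCB" if not mum1 else "non-GC"

print(hans_subtype(cd10_pct=60, bcl6_pct=80, mum1_pct=10))   # GCB
print(hans_subtype(cd10_pct=5, bcl6_pct=70, mum1_pct=50))    # non-GC
```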

Relevance: 30.00%

Abstract:

In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or highly detailed cellular structures within a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment that mimics cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding, which may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically so that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architectures. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design and implementation, and provides an overview of the available features, whose utility is highlighted in demonstrations.
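
A toy sketch of the event-driven idea follows: predicted event times sit in a priority queue, the earliest event is processed and only the affected molecules are re-predicted. It illustrates the data structure only and is not the CDS code.

```python
# Minimal sketch of the event-driven idea behind simulators like CDS:
# predicted event times (collisions, reactions) are kept in a priority
# queue, the earliest event is processed, and only the affected molecules
# get new predictions. Everything below is a toy illustration.
import heapq
import itertools

class EventQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker for equal times
        self._cancelled = set()

    def schedule(self, time, event_id, payload):
        heapq.heappush(self._heap, (time, next(self._counter), event_id, payload))

    def cancel(self, event_id):
        self._cancelled.add(event_id)       # lazy deletion

    def pop(self):
        while self._heap:
            time, _, event_id, payload = heapq.heappop(self._heap)
            if event_id not in self._cancelled:
                return time, event_id, payload
        return None

q = EventQueue()
q.schedule(0.8, "collision:A-B", ("A", "B"))
q.schedule(0.3, "reaction:C", ("C",))
q.cancel("collision:A-B")                   # e.g. molecule A was consumed earlier
print(q.pop())                              # -> (0.3, 'reaction:C', ('C',))
```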

Relevance: 30.00%

Abstract:

(1) A mathematical theory for computing the probabilities of various nucleotide configurations is developed, and the probability of obtaining the correct phylogenetic tree (model tree) from sequence data is evaluated for six phylogenetic tree-making methods (UPGMA, distance Wagner method, transformed distance method, the Fitch-Margoliash method, maximum parsimony method, and compatibility method). The number of nucleotides (m*) necessary to obtain the correct tree with a probability of 95% is estimated with special reference to the human, chimpanzee, and gorilla divergence. m* is at least 4,200, but the availability of outgroup species greatly reduces m* for all methods except UPGMA. m* increases if transitions occur more frequently than transversions, as in the case of mitochondrial DNA. (2) A new tree-making method called the neighbor-joining method is proposed. This method is applicable either to distance data or to character state data. Computer simulation has shown that the neighbor-joining method is generally better than UPGMA, Farris' method, Li's method, and the modified Farris method in recovering the true topology when distance data are used. A related method, the simultaneous partitioning method, is also discussed. (3) The maximum likelihood (ML) method for phylogeny reconstruction under the assumption of both constant and varying evolutionary rates is studied, and a new algorithm for obtaining the ML tree is presented. This method gives a tree similar to that obtained by UPGMA when a constant evolutionary rate is assumed, whereas it gives a tree similar to those obtained by the maximum parsimony method and the neighbor-joining method when a varying evolutionary rate is assumed.
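
As an illustration of the neighbor-joining criterion mentioned above, the sketch below performs a single joining step from a distance matrix (Q-matrix, pair selection and branch lengths); it is a textbook formulation, not the original implementation.

```python
# Sketch of one neighbor-joining step (Saitou & Nei): build the Q matrix
# from a distance matrix, pick the pair minimizing Q, and compute the
# branch lengths to the new internal node. Illustrative only.
import numpy as np

def nj_step(D):
    """D: symmetric (n, n) distance matrix with zero diagonal, n >= 3."""
    n = D.shape[0]
    r = D.sum(axis=1)
    Q = (n - 2) * D - r[:, None] - r[None, :]
    np.fill_diagonal(Q, np.inf)          # never join a taxon with itself
    i, j = np.unravel_index(np.argmin(Q), Q.shape)
    # Branch lengths from taxa i and j to the new node u.
    li = 0.5 * D[i, j] + (r[i] - r[j]) / (2 * (n - 2))
    lj = D[i, j] - li
    # Distances from every other taxon k to the new node u
    # (the entries at positions i and j are discarded by the caller).
    d_new = 0.5 * (D[i] + D[j] - D[i, j])
    return (i, j), (li, lj), d_new

# Additive 4-taxon example: taxa 0 and 1 are joined with branch
# lengths 2 and 3.
D = np.array([[0., 5., 9., 9.],
              [5., 0., 10., 10.],
              [9., 10., 0., 8.],
              [9., 10., 8., 0.]])
print(nj_step(D))
```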

Relevance: 30.00%

Abstract:

Derivation of probability estimates complementary to geophysical data sets has gained special attention over recent years. Information about the confidence level of provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret final results of a particular analysis. For the generation of products based on satellite data, a common input is a cloud mask, which allows discrimination between surface and cloud signals. Further, the surface information is divided between snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: cloudy/clear-sky, cloudy/snow and clear-sky/snow. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, in the PCM algorithm all spectral, angular and ancillary information is used in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for each spectral test is overcome by the concept of a multidimensional information space divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of an invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to the northern parts of Africa, which pose diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrate the good detection skill of the PCM method, with results comparable to or better than the reference PPS algorithm.
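
A toy version of the bin-and-look-up idea is sketched below: the feature space is split into bins, each bin stores the empirical cloud fraction of the training pixels that fell into it, and new pixels are classified by indexing that table. Features, bin edges and data are invented for illustration.

```python
# Toy sketch of the look-up-table idea: the multidimensional feature space
# is split into bins, each bin stores the empirical probability of "cloudy"
# among training pixels that fell into it, and new pixels are classified
# by indexing that table. Bin edges and features are invented placeholders.
import numpy as np

def build_lut(features, is_cloudy, bin_edges):
    """features: (n_pixels, n_dims); bin_edges: list of 1D edge arrays."""
    idx = [np.digitize(features[:, d], bin_edges[d]) - 1
           for d in range(features.shape[1])]
    shape = [len(e) - 1 for e in bin_edges]
    cloudy = np.zeros(shape)
    total = np.zeros(shape)
    np.add.at(cloudy, tuple(idx), is_cloudy)
    np.add.at(total, tuple(idx), 1)
    with np.errstate(invalid="ignore"):
        return cloudy / total                 # NaN where a bin saw no data

def lookup(lut, features, bin_edges):
    idx = tuple(np.digitize(features[:, d], bin_edges[d]) - 1
                for d in range(features.shape[1]))
    return lut[idx]

# Two hypothetical features, e.g. a reflectance and a BT difference.
edges = [np.linspace(0, 1, 11), np.linspace(-10, 10, 21)]
feats = np.random.rand(5000, 2) * [1, 20] - [0, 10]
labels = (feats[:, 0] > 0.5).astype(float)    # fake "truth" for the demo
lut = build_lut(feats, labels, edges)
print(lookup(lut, feats[:5], edges))
```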

Relevance: 30.00%

Abstract:

Aim. This study focused on (i) the detection of specific BVDV antibodies within selected cattle farms, (ii) the identification of persistently infected (PI) animals and (iii) the genetic typing of selected BVDV isolates. Methods. RNA extraction, real-time polymerase chain reaction, ELISA, sequencing. Results. Specific BVDV antibodies were detected in 713 of 1,059 analyzed samples (67.3 per cent). This figure is in agreement with findings in many cattle herds around the world; however, the number of positive samples differed between the herds. While 57 samples out of 283 (20.1 per cent) were positive in the first herd, 400 out of 475 (84.2 per cent) and 256 out of 301 (85 per cent) animals were positive in the second and third herds, respectively. The real-time PCR assay detected BVDV RNA in 5 of 1,068 samples analyzed (0.5 per cent): 4 positive samples out of 490 (0.8 per cent) and 1 out of 301 (0.33 per cent) were found in the second and third herds, whereas no BVDV genetic material was found in the first herd. Data on the number of PI animals were in accord with the serological findings in the cattle herds involved in our study. The genetic typing of viral isolates revealed that only BVDV type 1 viruses were present. The phylogenetic analysis confirmed two BVDV-1 subtypes, namely b and f, and revealed that all 4 viruses from the second farm were typed as BVDV-1b and were absolutely identical in the 5'-UTR, whereas the virus from the third farm was typed as BVDV-1f. Conclusion. Our results indicate that BVDV infection is widespread in cattle herds in eastern Ukraine, which requires further research and the development of new approaches to improve the current situation.