970 results for 650200 Mining and Extraction


Relevance:

100.00%

Publisher:

Abstract:

Internal quantum efficiency (IQE) of a high-brightness blue LED has been evaluated from the external quantum efficiency measured as a function of current at room temperature. Processing the data with a novel evaluation procedure based on the ABC-model, we have determined separately the IQE of the LED structure and the light extraction efficiency (LEE) of the UX:3 chip.

Nowadays, understanding of LED efficiency behavior at high currents is quite critical to finding ways for further improvement of III-nitride LED performance [1]. External quantum efficiency ηe (EQE) provides integral information on the recombination and photon emission processes in LEDs. Meanwhile, EQE is the product of IQE ηi and LEE ηext at negligible carrier leakage from the active region. Separate determination of IQE and LEE would be much more helpful, providing correlation between these parameters and specific epi-structure and chip design. In this paper, we extend the approach of [2,3] to the whole range of current/optical power variation, providing an express tool for separate evaluation of IQE and LEE. We studied an InGaN-based LED fabricated by Osram OS. The LED structure, grown by MOCVD on a sapphire substrate, was processed as a UX:3 chip and mounted into a Golden Dragon package without molding. EQE was measured with a Labsphere CDS-600 spectrometer. Plotting EQE versus output power P and finding the power Pm corresponding to the EQE maximum ηm enables comparing the measurements with the analytical relationships ηi = Q/(Q + p^(1/2) + p^(-1/2)), where p = P/Pm and Q = B/(AC)^(1/2), with A, B and C being the recombination constants [4]. As a result, the maximum IQE value, equal to Q/(Q+2), can be found from the ratio ηm/ηe plotted as a function of p^(1/2) + p^(-1/2) (see Fig. 1a), and then LEE is calculated as ηext = ηm (Q+2)/Q. Experimental EQE as a function of normalized optical power p is shown in Fig. 1b along with the analytical approximation based on the ABC-model. The approximation fits the measurements perfectly over a range of optical power (or operating current) variation of eight orders of magnitude. In conclusion, a new express method for separate evaluation of IQE and LEE of III-nitride LEDs is suggested and applied to the characterization of a high-brightness blue LED. With this method, we obtained a LEE from the free chip surface to the air of 69.8% and an IQE of 85.7% at the maximum and 65.2% at the operation current of 350 mA.

[1] G. Verzellesi, D. Saguatti, M. Meneghini, F. Bertazzi, M. Goano, G. Meneghesso, and E. Zanoni, "Efficiency droop in InGaN/GaN blue light-emitting diodes: Physical mechanisms and remedies," J. Appl. Phys., vol. 114, no. 7, p. 071101, Aug. 2013.
[2] C. van Opdorp and G. W. 't Hooft, "Method for determining effective nonradiative lifetime and leakage losses in double-heterostructure lasers," J. Appl. Phys., vol. 52, no. 6, pp. 3827-3839, Feb. 1981.
[3] M. Meneghini, N. Trivellin, G. Meneghesso, E. Zanoni, U. Zehnder, and B. Hahn, "A combined electro-optical method for the determination of the recombination parameters in InGaN-based light-emitting diodes," J. Appl. Phys., vol. 106, no. 11, p. 114508, Dec. 2009.
[4] Qi Dai, Qifeng Shan, Jing Wang, S. Chhajed, Jaehee Cho, E. F. Schubert, M. H. Crawford, D. D. Koleske, Min-Ho Kim, and Yongjo Park, "Carrier recombination mechanisms and efficiency droop in GaInN/GaN light-emitting diodes," Appl. Phys. Lett., vol. 97, no. 13, p. 133507, Sept. 2010.

© 2014 IEEE.
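A minimal sketch (in Python, not the authors' code) of how the ABC-model relations quoted above can be used to recover the peak IQE and the LEE from measured EQE-versus-power data; the function and variable names are illustrative assumptions.

```python
import numpy as np

def iqe_lee_from_eqe(power, eqe):
    """Estimate peak IQE and LEE from EQE(P) using the ABC-model relations.

    Uses the linearization implied by the abstract: eta_m / eta_e is linear in
    x = p^(1/2) + p^(-1/2), with slope 1/(Q+2), where p = P/Pm and Q = B/(A*C)^(1/2).
    """
    power = np.asarray(power, dtype=float)
    eqe = np.asarray(eqe, dtype=float)

    i_max = np.argmax(eqe)
    eta_m, p_m = eqe[i_max], power[i_max]      # EQE maximum and corresponding power
    p = power / p_m
    x = np.sqrt(p) + 1.0 / np.sqrt(p)

    # Linear fit: eta_m / eqe = (x + Q) / (Q + 2)  ->  slope = 1/(Q+2)
    slope, _intercept = np.polyfit(x, eta_m / eqe, 1)
    Q = 1.0 / slope - 2.0

    iqe_max = Q / (Q + 2.0)                     # peak internal quantum efficiency
    lee = eta_m * (Q + 2.0) / Q                 # light extraction efficiency
    iqe = Q / (Q + x)                           # IQE over the whole measured range
    return iqe_max, lee, iqe
```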

Relevance:

100.00%

Publisher:

Abstract:

Software product line modeling aims at capturing a set of software products in an economic yet meaningful way. We introduce a class of variability models that capture the sharing between the software artifacts forming the products of a software product line (SPL) in a hierarchical fashion, in terms of commonalities and orthogonalities. Such models are useful when analyzing and verifying all products of an SPL, since they provide a scheme for divide-and-conquer-style decomposition of the analysis or verification problem at hand. We define an abstract class of SPLs for which variability models can be constructed that are optimal w.r.t. the chosen representation of sharing. We show how the constructed models can be fed into a previously developed algorithmic technique for compositional verification of control-flow temporal safety properties, so that the properties to be verified are iteratively decomposed into simpler ones over orthogonal parts of the SPL, and are not re-verified over the shared parts. We provide tool support for our technique, and evaluate our tool on a small but realistic SPL of cash desks.
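As an illustration of the kind of hierarchical sharing such variability models capture, here is a minimal Python sketch (an assumption for illustration, not the paper's formalism): each node holds the artifacts shared by all products below it, its children represent orthogonal variation, and a property checked on a shared part is not re-verified for each product.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Set

@dataclass
class VariabilityNode:
    """A node in a hierarchical variability model: artifacts common to all
    products below it, plus children capturing orthogonal variation points."""
    shared_artifacts: Set[str]
    children: List["VariabilityNode"] = field(default_factory=list)

def verify(node: VariabilityNode, check: Callable[[Set[str]], bool]) -> bool:
    """Divide-and-conquer verification: the shared artifacts of each node are
    checked once and the result reused by every product below; the orthogonal
    children are then verified independently."""
    if not check(node.shared_artifacts):
        return False
    return all(verify(child, check) for child in node.children)
```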

Relevance:

100.00%

Publisher:

Abstract:

With the explosive growth of the volume and complexity of document data (e.g., news, blogs, web pages), it has become a necessity to semantically understand documents and deliver meaningful information to users. The areas dealing with these problems cross data mining, information retrieval, and machine learning. For example, document clustering and summarization are two fundamental techniques for understanding document data and have attracted much attention in recent years. Given a collection of documents, document clustering aims to partition them into different groups to provide efficient document browsing and navigation mechanisms. One open problem in document clustering is how to generate a meaningful interpretation for each document cluster resulting from the clustering process. Document summarization is another effective technique for document understanding, which generates a summary by selecting sentences that deliver the major or topic-relevant information in the original documents. How to improve automatic summarization performance and how to apply it to newly emerging problems are two valuable research directions. To help people capture the semantics of documents effectively and efficiently, the dissertation focuses on developing effective data mining and machine learning algorithms and systems for (1) integrating document clustering and summarization to obtain meaningful document clusters with summarized interpretation, (2) improving document summarization performance and building document understanding systems to solve real-world applications, and (3) summarizing the differences and evolution of multiple document sources.
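For illustration only (this is not the dissertation's algorithm), a minimal sketch of the basic idea of pairing clustering with a summarized interpretation: cluster documents by TF-IDF and label each cluster with its highest-weight terms.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def cluster_and_interpret(docs, n_clusters=3, n_terms=5):
    """Cluster documents and attach a keyword-based interpretation to each cluster."""
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)

    terms = np.array(vectorizer.get_feature_names_out())
    interpretations = {}
    for k in range(n_clusters):
        top = np.argsort(km.cluster_centers_[k])[::-1][:n_terms]   # highest-weight terms
        interpretations[k] = terms[top].tolist()
    return km.labels_, interpretations
```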

Relevance:

100.00%

Publisher:

Abstract:

Due to the rapid advances in computing and sensing technologies, enormous amounts of data are being generated every day in various applications. The integration of data mining and data visualization has been widely used to analyze these massive and complex data sets to discover hidden patterns. For both data mining and visualization to be effective, it is important to include visualization techniques in the mining process and to present the discovered patterns in a more comprehensive visual view. In this dissertation, four related problems are studied to explore the integration of data mining and data visualization: dimensionality reduction for visualizing high-dimensional datasets, visualization-based clustering evaluation, interactive document mining, and exploration of multiple clusterings. In particular, we 1) propose an efficient feature selection method (ReliefF + mRMR) for preprocessing high-dimensional datasets; 2) present DClusterE to integrate cluster validation with user interaction and provide rich visualization tools for users to examine document clustering results from multiple perspectives; 3) design two interactive document summarization systems that incorporate user effort and generate customized summaries from 2D sentence layouts; and 4) propose a new framework which organizes the different input clusterings into a hierarchical tree structure and allows for interactive exploration of multiple clustering solutions.
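As a rough illustration of the redundancy-aware selection idea behind mRMR (the ReliefF stage and the dissertation's exact ReliefF + mRMR method are not reproduced here), a minimal sketch that uses mutual information for relevance and absolute correlation as a redundancy proxy; it assumes a dense feature matrix and illustrative names.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mrmr_select(X, y, n_features=10):
    """Greedy mRMR-style selection: maximize relevance to y, minimize redundancy.

    Relevance is mutual information with the class label; redundancy is the mean
    absolute correlation with already-selected features (a common approximation).
    """
    relevance = mutual_info_classif(X, y, random_state=0)
    corr = np.abs(np.corrcoef(X, rowvar=False))         # feature-feature redundancy proxy

    selected = [int(np.argmax(relevance))]
    while len(selected) < n_features:
        candidates = [j for j in range(X.shape[1]) if j not in selected]
        scores = [relevance[j] - corr[j, selected].mean() for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    return selected
```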

Relevance:

100.00%

Publisher:

Abstract:

Sub-optimal recovery of bacterial DNA from whole blood samples can limit the sensitivity of molecular assays to detect pathogenic bacteria. We compared three different pre-lysis protocols (none, mechanical pre-lysis and achromopeptidase pre-lysis) and five commercially available DNA extraction platforms for direct detection of Group B Streptococcus (GBS) in spiked whole blood samples, without enrichment culture. DNA was extracted using the QIAamp Blood Mini kit (Qiagen), UCP Pathogen Mini kit (Qiagen), QuickGene DNA Whole Blood kit S (Fuji), Speed Xtract Nucleic Acid Kit 200 (Qiagen) and MagNA Pure Compact Nucleic Acid Isolation Kit I (Roche Diagnostics Corp). Mechanical pre-lysis increased yields of bacterial genomic DNA by 51.3-fold (95% confidence interval: 31.6–85.1, p < 0.001) and pre-lysis with achromopeptidase by 6.1-fold (95% CI: 4.2–8.9, p < 0.001), compared with no pre-lysis. Differences in yield due to pre-lysis were 2- to 3-fold larger than differences in yield between extraction methods. Including a pre-lysis step can improve the limits of detection of GBS using PCR or other molecular methods without the need for culture.
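Purely to illustrate how fold-change estimates with confidence intervals of this kind can be computed (the authors' actual statistical analysis is not described in the abstract; paired yield measurements are assumed here), a minimal sketch on log-transformed yields:

```python
import numpy as np
from scipy import stats

def fold_change_ci(yields_treated, yields_control, alpha=0.05):
    """Geometric-mean fold change in DNA yield with a confidence interval,
    computed from paired measurements on the log scale."""
    log_ratio = np.log(np.asarray(yields_treated, float)) - np.log(np.asarray(yields_control, float))
    n = log_ratio.size
    mean, sem = log_ratio.mean(), log_ratio.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    return np.exp(mean), np.exp(mean - t_crit * sem), np.exp(mean + t_crit * sem)
```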

Relevance:

100.00%

Publisher:

Abstract:

β-methylamino-L-alanine (BMAA) is a neurotoxin linked to neurodegeneration, which is manifested in the devastating human diseases amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. This neurotoxin is known to be produced by almost all tested species within the cyanobacterial phylum, including free-living as well as symbiotic strains. The global distribution of the BMAA producers ranges from a terrestrial ecosystem on the island of Guam in the Pacific Ocean to an aquatic ecosystem in Northern Europe, the Baltic Sea, where massive surface blooms occur annually. BMAA has been shown to accumulate in the Baltic Sea food web, with the highest levels in bottom-dwelling fish species as well as in mollusks. One of the aims of this thesis was to test the bottom-dwelling bioaccumulation hypothesis by using a larger number of samples, allowing a statistical evaluation. Hence, a large set of fish individuals from Lake Finjasjön were caught, and the BMAA concentrations in different tissues were related to the season of catching, fish gender, total weight and species. The results reveal that fish total weight and fish species were positively correlated with the BMAA concentration in the fish brain. Therefore, significantly higher concentrations of BMAA in the brain were detected in plankti-benthivorous fish species and in heavier (potentially older) individuals. Another goal was to investigate the potential production of BMAA by other phytoplankton organisms. Therefore, diatom cultures were investigated and confirmed to produce BMAA, even in higher concentrations than cyanobacteria. All diatom cultures studied during this thesis work were shown to contain BMAA, as well as one dinoflagellate species. This might imply that the environmental spread of BMAA in aquatic ecosystems is even higher than previously thought. Earlier reports on the concentration of BMAA in different organisms have shown highly variable results, and the methods used for quantification have been intensively discussed in the scientific community. In the most recent studies, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become the instrument of choice, due to its high sensitivity and selectivity. Even so, different studies show quite variable concentrations of BMAA. In this thesis, three of the most common BMAA extraction protocols were evaluated in order to find out whether the extraction could be one of the sources of variability. It was found that the method involving precipitation of proteins using trichloroacetic acid gave the best performance, complying with all in-house validation criteria. However, extractions of diatom and cyanobacteria cultures with this validated method, quantified using LC-MS/MS, still resulted in variable BMAA concentrations, which suggests that biological factors also contribute to the discrepancies. The current knowledge on the environmental factors that can induce or reduce BMAA production is still limited. In cyanobacteria, production of BMAA was earlier shown to be negatively correlated with nitrogen availability, both in laboratory cultures and in natural populations. Based on this observation, it was suggested that in unicellular non-diazotrophic cyanobacteria, BMAA might take part in nitrogen metabolism. In order to find out whether BMAA has a similar role in diatoms, BMAA was added to two diatom species in culture, in concentrations corresponding to those earlier found in the diatoms. The results suggest that BMAA might induce a nitrogen starvation signal in diatoms, as was earlier observed in cyanobacteria. However, the diatoms recover shortly, owing to the extracellular presence of excreted ammonia. Thus, also in diatoms, BMAA might be involved in the nitrogen balance of the cell.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to assess worker exposure to mineral dust particles. A metabolic model, based on the model adopted by the ICRP, was applied to assess human exposure to Ta and to predict Ta concentrations in excreta. The occupational exposure to Th-, U-, Nb-, and Ta-bearing particles during routine tasks to obtain Fe-Nb alloys was estimated using air samplers and excreta samples. Ta concentrations in food samples and in drinking water were also determined. The results support the conclusion that workers were occupationally exposed to Ta-bearing particles, and also indicate that a source of Ta exposure for both the workers and the control group was the ingestion of drinking water containing soluble compounds of Ta. Therefore, some Ta compounds should be considered soluble in the gastrointestinal tract. Consequently, the metabolic model based on the ICRP model and/or the transfer factor f1 for Ta should be reviewed, and the solubility of Ta compounds in the gastrointestinal tract should be determined.
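To illustrate the role a gut-uptake (transfer) factor f1 plays in predicting excretion, here is a deliberately simplified one-compartment sketch; it is not the ICRP model, and the parameter names and dynamics are illustrative assumptions only.

```python
import numpy as np

def one_compartment_excretion(intake_rate, f1, lam, days):
    """Very simplified one-compartment biokinetic sketch (not the ICRP model):
    constant daily intake, a gut-uptake fraction f1, and first-order elimination
    with rate constant lam (1/day). Returns body burden and daily excretion."""
    burden = np.zeros(days)
    excreted = np.zeros(days)
    for t in range(1, days):
        uptake = f1 * intake_rate                           # fraction absorbed from the gut
        elimination = lam * burden[t - 1]                   # first-order loss to excreta
        burden[t] = burden[t - 1] + uptake - elimination
        excreted[t] = elimination + (1 - f1) * intake_rate  # unabsorbed fraction passes through
    return burden, excreted
```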

Relevance:

100.00%

Publisher:

Abstract:

The paper develops a Dynamic Stochastic General Equilibrium (DSGE) model to assess the macroeconomic and labor market effects of a positive shock to the stochastic component of mining-energy sector productivity. When the model is calibrated for the Colombian economy, this shock generates an overall increase in formal wages and a rise in tax revenues, expanding total household consumption. These effects raise non-tradable goods prices relative to tradable goods prices, so the real exchange rate falls (appreciation) and productive resources are displaced from the tradable (manufacturing) sector to the non-tradable sector, followed by an increase in formal GDP and formal job gains. This situation leads the formal sector to absorb workers from the informal sector through the non-tradable formal subsector, which causes informal GDP to fall. As a consequence, net consumption falls for informal workers, which leads some household members to withhold their labor from the informal sector and remain unemployed instead. Therefore, the final result in the labor market is a decrease in the number of informal workers, some of whom move to the formal sector while the rest become unemployed.
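As a minimal illustration of what "a positive shock to the stochastic component of productivity" means in a DSGE setting, the sketch below simulates a log AR(1) productivity process with a one-time positive impulse; the persistence and shock sizes are illustrative, not the paper's calibration.

```python
import numpy as np

def simulate_productivity_shock(rho=0.9, sigma=0.01, shock=0.05, periods=40, seed=0):
    """Sketch of the stochastic productivity component as an AR(1) process in logs:
    a one-time positive shock in period 1 that decays geometrically at rate rho."""
    rng = np.random.default_rng(seed)
    log_a = np.zeros(periods)
    for t in range(1, periods):
        eps = shock if t == 1 else rng.normal(0.0, sigma)   # positive impulse, then noise
        log_a[t] = rho * log_a[t - 1] + eps
    return np.exp(log_a)    # productivity level relative to steady state
```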

Relevance:

100.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
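The following is a minimal sketch of the trace-checking idea behind the first part of the framework; it is not FOLCSL, and the event fields and example invariant are illustrative assumptions.

```python
from typing import Callable, Dict, Iterable, List

Event = Dict[str, object]
Invariant = Callable[[List[Event]], bool]

def check_invariants(trace: Iterable[Event], invariants: Dict[str, Invariant]) -> Dict[str, bool]:
    """Evaluate each named invariant over the full event trace and report the result."""
    events = list(trace)
    return {name: inv(events) for name, inv in invariants.items()}

# Example invariant: every "acquire" event is matched by a later "release".
def acquire_release_balanced(events: List[Event]) -> bool:
    open_locks = 0
    for e in events:
        if e.get("kind") == "acquire":
            open_locks += 1
        elif e.get("kind") == "release":
            open_locks -= 1
        if open_locks < 0:          # a release without a matching acquire
            return False
    return open_locks == 0
```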

Relevance:

100.00%

Publisher:

Abstract:

This text is an ideal starting point to understand the regulatory regimes and policy challenges relevant to Australia's mining sector.

Relevance:

100.00%

Publisher:

Abstract:

Stress relaxation is relevant to the design of both civil and mining excavations. While many authors refer to the adverse effect of stress relaxation on excavation stability, some present compelling empirical evidence indicating that stress relaxation does not have a significant effect. Establishing clear definitions of stress relaxation was critical to understanding and quantifying the various types of stress relaxation that have been referred to in the literature. This paper defines three types of stress relaxation – partial relaxation, full relaxation and tangential relaxation. Once these definitions were established, it became apparent that the theoretical arguments and empirical evidence presented by various authors to support their respective cases are not contradictory; rather, the different conclusions can be attributed to different types of stress relaxation. In particular, when the minor principal stress is negative, the intermediate principal stress has been identified as significantly affecting jointed rock mass behaviour. The aim of the study was to review and evaluate existing methods of quantifying the effect of stress relaxation around underground excavations and, if necessary, propose a new set of recommendations. An empirical stope stability model, termed the Extended Mathews stability chart, was considered to be the most appropriate method of quantifying the effects of stress relaxation. A new set of guidelines to account for the effect of stress relaxation on excavation stability in the Extended Mathews stability chart has been proposed from a back-analysis of 55 case histories of stress relaxation.
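For orientation, the standard Mathews stability method plots a stability number N against the hydraulic radius of the stope surface; the sketch below shows only that basic calculation and does not reproduce the Extended Mathews chart or the proposed stress-relaxation guidelines (the example values are illustrative).

```python
def mathews_stability_number(q_prime, rock_stress_factor_a, joint_orientation_factor_b,
                             gravity_factor_c):
    """Standard Mathews stability number N = Q' * A * B * C."""
    return q_prime * rock_stress_factor_a * joint_orientation_factor_b * gravity_factor_c

def hydraulic_radius(area_m2, perimeter_m):
    """Hydraulic radius (shape factor) of a stope surface: area divided by perimeter."""
    return area_m2 / perimeter_m

# Illustrative use: a stope surface plots as the point (HR, N) on the stability chart.
n = mathews_stability_number(q_prime=10.0, rock_stress_factor_a=0.3,
                             joint_orientation_factor_b=0.3, gravity_factor_c=6.0)
hr = hydraulic_radius(area_m2=30.0 * 20.0, perimeter_m=2 * (30.0 + 20.0))
```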

Relevance:

100.00%

Publisher: