945 results for 640200 Primary Mining and Extraction Processes
Abstract:
The Exhibitium Project, funded by the BBVA Foundation, is a data-driven project developed by an international consortium of research groups. One of its main objectives is to build a prototype that will serve as the basis for a platform for recording and exploiting data about art exhibitions available on the Internet. Accordingly, our proposal aims to expose the methods, procedures and decision-making processes that have governed the technological implementation of this prototype, especially with regard to the reuse of WordPress (WP) as a development framework.
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that are discovered by developers in an iterative process. To manage such complexity, advanced verification techniques that continually match the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
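The invariant-checking idea above (a specification compiled into a checker that scans an event trace) can be illustrated with a minimal sketch; the event kinds, identifiers and trace format here are invented for the example and are not FOLCSL syntax:

```python
# Hypothetical sketch of trace-based invariant checking: the temporal
# invariant "every REQUEST event is eventually followed by a matching
# COMPLETE event" is verified against a recorded event trace.

def check_request_complete(trace):
    """Return True if every ('REQUEST', id) has a later ('COMPLETE', id)."""
    pending = set()
    for kind, ident in trace:
        if kind == "REQUEST":
            pending.add(ident)
        elif kind == "COMPLETE":
            pending.discard(ident)
    return not pending  # any leftover REQUEST violates the invariant

good = [("REQUEST", 1), ("COMPLETE", 1), ("REQUEST", 2), ("COMPLETE", 2)]
bad = [("REQUEST", 1), ("REQUEST", 2), ("COMPLETE", 1)]
```

A real checker synthesized from a first-order specification would handle arbitrary quantified invariants, but the single-pass scan over the trace captures the basic shape of the approach.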
Abstract:
Water Distribution Networks (WDNs) play a vital role in communities, ensuring well-being and supporting economic growth and productivity. The need for greater investment requires design choices that will impact the efficiency of management in the coming decades. This thesis proposes an algorithmic approach to two related problems: (i) identifying the fundamental asset of large WDNs in terms of their main infrastructure; (ii) sectorizing large WDNs into isolated sectors while respecting the minimum service to be guaranteed to users. Two methodologies were developed to meet these objectives and were subsequently integrated into an overall process that optimizes the sectorized configuration of a WDN, taking into account the need to address problems (i) and (ii) within a single global vision. With regard to problem (i), the methodology developed introduces the concept of a primary network and answers with a dual approach: connecting the main nodes of the WDN in terms of hydraulic infrastructure (reservoirs, tanks, pump stations) and identifying hypothetical paths with minimal energy losses. The primary network thus identified can be used as an initial basis for designing the sectors. The sectorization problem (ii) was addressed with optimization techniques through the development of a new dedicated Tabu Search algorithm able to deal with real case studies of WDNs. To this end, three new large WDN models were developed in order to test the capabilities of the algorithm on different and complex real cases. The methodology also automatically identifies deficient parts of the primary network and dynamically includes new edges in order to support a sectorized configuration of the WDN. Applying the overall algorithm to the new real case studies, and to others from the literature, yielded applicable solutions even in specific complex situations.
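The search for paths with minimal energy losses can be sketched as a shortest-path computation over the pipe graph; this is only an illustration of the idea, with made-up nodes and head-loss weights, and is not the thesis's actual primary-network algorithm:

```python
import heapq

# Treat the WDN as a directed graph whose edge weights approximate head
# loss; Dijkstra's algorithm then gives the least-loss path cost from a
# source (e.g. a reservoir) to every reachable node.

def dijkstra(edges, source):
    """edges: {node: [(neighbour, head_loss), ...]}; returns total loss per node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in edges.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network: head losses (arbitrary units) on each pipe.
network = {
    "reservoir": [("A", 1.2), ("B", 0.8)],
    "A": [("tank", 0.5)],
    "B": [("tank", 1.5), ("A", 0.1)],
}
losses = dijkstra(network, "reservoir")
```

The union of such least-loss paths between the main hydraulic nodes would form a candidate primary network on which sector boundaries can then be drawn.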
Abstract:
The prognostic relevance of different molecular markers in lung cancer is a crucial issue still worth investigating, and the specimens collected and analyzed represent a valuable source of material. Cyclin-D1, c-erbB-2 and vascular endothelial growth factor (VEGF) have been shown to be promising prognosticators in human cancer. In this study, we sought to examine the importance of Cyclin-D1, c-erbB-2 and VEGF, and to study the quantitative relationship among these factors and disease progression in metastases vs corresponding primary cancers, and in metastatic vs non-metastatic cancers. Material and Methods: We used immunohistochemistry and morphometric analysis to evaluate the amount of tumour staining for Cyclin-D1, c-erbB-2 and VEGF in 52 patients with surgically excised adenocarcinoma of the lung; the outcome for our study was survival time until death from hematogenic metastases. Results: Metastases presented lower c-erbB-2 expression than corresponding primary cancers (p=0.02). Cyclin-D1 and VEGF expression were also lower in metastases than in corresponding primary cancers, but this difference did not reach statistical significance. Non-metastatic cancers also presented significantly lower Cyclin-D1 and c-erbB-2 expression than metastatic cancers (p<0.01 and p<0.01, respectively). Equally significant was the higher c-erbB-2 expression of metastatic cancers compared to non-metastatic cancers (p=0.02). In Kaplan-Meier survival analysis, Cyclin-D1 (p=0.04), c-erbB-2 (p=0.04) and VEGF (p<0.01) were important predictors of survival in metastatic cancers.
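The Kaplan-Meier analysis used above can be illustrated with a minimal estimator in plain Python; the subject data below are invented for the example, and the study itself presumably used standard statistical software:

```python
# Minimal Kaplan-Meier product-limit estimator. Each subject is a pair
# (time, event) with event=1 for an observed death and event=0 for
# censoring (subject left the study alive at that time).

def kaplan_meier(data):
    """Return [(time, survival_probability)] at each observed event time."""
    event_times = sorted({t for t, e in data if e == 1})
    s = 1.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti, _ in data if ti >= t)     # still under observation
        deaths = sum(1 for ti, e in data if ti == t and e == 1)
        s *= 1.0 - deaths / at_risk                        # product-limit step
        curve.append((t, s))
    return curve

# Five hypothetical subjects: deaths at t=2, 5, 7; censored at t=3 and 8.
sample = [(2, 1), (3, 0), (5, 1), (7, 1), (8, 0)]
curve = kaplan_meier(sample)
```

At each death time the surviving fraction is multiplied by (1 − deaths/at-risk), which is exactly how censored subjects contribute to the risk set without being counted as events.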
Abstract:
A century after its discovery, Chagas' disease still represents a major public health challenge in Latin America. Moreover, because of growing population movements, an increasing number of cases of imported Chagas' disease have now been detected in non-endemic areas, such as North America and some European countries. This parasitic zoonosis, caused by Trypanosoma cruzi, is transmitted to humans by infected Triatominae insects, or occasionally by non-vectorial mechanisms such as blood transfusion, mother-to-fetus transmission, or oral ingestion of materials contaminated with parasites. Following the acute phase of the infection, untreated individuals enter a chronic phase that is initially asymptomatic or clinically unapparent. Usually, a few decades later, 40-50% of patients develop progressive cardiomyopathy and/or motility disturbances of the oesophagus and colon. In recent decades, several interventions targeting primary, secondary and tertiary prevention of Chagas' disease have been attempted. While control of both vectorial and blood-transfusion transmission of T. cruzi (primary prevention) has been successful in many regions of Latin America, early detection and aetiological treatment of asymptomatic subjects with Chagas' disease (secondary prevention) have been largely underutilised. At the same time, in patients with established chronic disease, several pharmacological and non-pharmacological interventions are currently available and have been increasingly used with the intention of preventing or delaying complications of the disease (tertiary prevention). In this review we discuss each of these issues in detail.
Combined photocatalytic and fungal processes for the treatment of nitrocellulose industry wastewater
Abstract:
The objective of this work was to characterize the delignification effluent originating from the nitrocellulose industry and to evaluate the combination of fungal and photocatalytic (TiO₂/UV) processes for the treatment of this effluent. The delignification effluent has proven harmful to the environment because it presents high color (3516 CU), total phenol (876 mg/L) and TOC (1599 mg/L), and it is also highly toxic even at low concentrations. The results of photocatalysis were 11%, 25% and 13% higher for the reductions in color, total phenol and TOC, respectively. The combined treatments presented benefits when compared to the non-combined treatments. Fungal treatment and photocatalysis in combination proved to be the best treatment, reducing color, total phenol, toxicity (inhibition of Escherichia coli growth) and TOC by 94.2%, 92.6%, 4.9% and 62%, respectively. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
This study deals with two innovative brewing processes: high-gravity batch and completely continuous beer fermentation systems. The results show a significant influence of variables such as concentration and temperature on the yield factor of substrate conversion into ethanol and, consequently, on the productivity of the high-gravity batch process. The technological feasibility of continuous beer production based on yeast immobilization on cheap alternative carriers was also demonstrated. The influence of process parameters on fermentation performance and on the quality of the obtained beers was studied by sensory analysis. No significant difference in the degree of acceptance between the obtained products and some traditional market brands was found. (c) 2008 Institute of Chemistry, Slovak Academy of Sciences.
Abstract:
The objective of this work was to study the operational feasibility of nitrification and denitrification processes in a mechanically stirred sequencing batch reactor (SBR) operated in batch and fed-batch mode. The reactor was equipped with a draft tube to improve mass transfer and contained dispersed (aerobic) and granulated (anaerobic) biomass. The following reactor variables were adjusted: aeration time during the nitrification step; dissolved oxygen concentration; feed time defining the batch and fed-batch phases; concentration of the external carbon source used as electron donor during the denitrification stage; and volumetric ammonium nitrogen load in the influent. The reactor (5 L volume) was maintained at 30 ± 1 °C and treated either 1.0 or 1.5 L of wastewater in 8-h cycles. The ammonium nitrogen concentrations assessed were 50 (condition 1) and 100 mg N-NH₄⁺·L⁻¹ (condition 2), resulting in volumetric loads of 29 and 67 mg N-NH₄⁺·L⁻¹·d⁻¹, respectively. A synthetic medium and ethanol were used as external carbon sources (ECS). Total nitrogen removal efficiencies were 94.4% and 95.9% when the reactor was operated under conditions 1 and 2, respectively. Low nitrite (0.2 and 0.3 mg N-NO₂⁻·L⁻¹, respectively) and nitrate (0.01 and 0.3 mg N-NO₃⁻·L⁻¹, respectively) concentrations were detected in the effluent, and ammonium nitrogen removal efficiencies were 97.6% and 99.6% under conditions 1 and 2, respectively.
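The removal efficiencies quoted above are straightforward percentage reductions from influent to effluent; a small helper makes the arithmetic explicit (the 1.2 mg/L effluent value below is back-calculated from the reported 97.6% figure and is not reported directly in the abstract):

```python
# Percent removal of a constituent between reactor influent and effluent.

def removal_efficiency(influent, effluent):
    """Return the percent removal for given influent/effluent concentrations."""
    return 100.0 * (influent - effluent) / influent

# Condition 1: 50 mg/L ammonium nitrogen in, ~1.2 mg/L out (assumed value).
eff = removal_efficiency(50.0, 1.2)
```

The same formula applied to the total nitrogen balances reproduces the 94.4% and 95.9% figures when the corresponding effluent nitrogen concentrations are used.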
Abstract:
The present study approaches the economic and technical evaluation of equivalent carbon dioxide (CO₂ eqv.) capture and storage processes, comparing a proposed case to a base case. The base case considers an offshore petroleum production facility with a high CO₂ content (4 vol%) in the produced gas and with both CO₂ and natural gas emissions to the atmosphere, called CO₂ eqv. emissions. The results obtained in this study, using the Hysys process simulator, showed a 65% reduction in CO₂ emissions for the proposed case relative to the base case.
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can consist of text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. So far, a number of classification algorithms are in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probabilistic methods such as the Bayesian classifier (Lewis, 1998); online methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); and example-based methods such as k-nearest neighbours (Duda & Hart, 1973) and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
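Of the example-based methods mentioned, k-nearest neighbours (Duda & Hart, 1973) is simple enough to sketch in a few lines; the training points below are invented for the illustration:

```python
import math
from collections import Counter

# Minimal k-nearest-neighbours classifier: label a query point by a
# majority vote among its k closest training points (Euclidean distance).

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; returns the predicted label."""
    ranked = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Two well-separated toy classes in the plane.
train = [
    ((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
    ((5, 5), "b"), ((6, 5), "b"), ((5, 6), "b"),
]
```

Being lazy (no training phase) and non-parametric, k-NN trades prediction-time cost for simplicity, which is why it often serves as the baseline example-based classifier in surveys like the one cited above.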
Abstract:
In quantum measurement theory it is necessary to show how a quantum source conditions a classical stochastic record of measured results. We discuss mesoscopic conductance using quantum stochastic calculus to elucidate the quantum nature of the measurement taking place in these systems. To illustrate the method, we derive the current fluctuations in a two-terminal mesoscopic circuit with two tunnel barriers containing a single quasi-bound state on the well. The method enables us to focus either on the incoming/outgoing Fermi fields in the leads, or on the irreversible dynamics of the well state itself. We show an equivalence between the approach of Büttiker and the Fermi quantum stochastic calculus for mesoscopic systems.
Abstract:
Emotional accounts of startle modulation predict that startle is facilitated if elicited during aversive foreground stimuli. Attentional accounts hold that startle is enhanced if startle-eliciting stimulus and foreground stimulus are in the same modality. Visual and acoustic foreground stimuli and acoustic startle probes were employed in aversive differential conditioning and in a stimulus discrimination task. Differential conditioning was evident in electrodermal responses and blink latency shortening in both modalities, but effects on magnitude facilitation were found only for visual stimuli. In the discrimination task, skin conductance responses, blink latency shortening, and blink magnitude facilitation were larger during to-be-attended stimuli regardless of stimulus modality. The present results support the notion that attention and emotion can affect blink startle modulation during foreground stimuli.