897 results for 091402 Geomechanics and Resources Geotechnical Engineering
Abstract:
Blasting has been the most frequently used method for rock breakage since black powder was first employed to fragment rocks, more than two hundred years ago. This paper reassesses standard blast design techniques by providing an alternative approach, here termed asymmetric blasting. Based on real-time rock recognition through measurement while drilling (MWD) techniques, asymmetric blasting deals with rock properties as they occur in nature, i.e., randomly and asymmetrically distributed in space. It is well accepted that the performance of basic mining operations, such as excavation and crushing, relies on a broken rock mass that has been pre-conditioned by the blast. By pre-conditioned we mean well fragmented, sufficiently loose and with an adequate muckpile profile. These muckpile characteristics affect loading and hauling [1]. The influence of blasting does not end there. Under the Mine to Mill paradigm, blasting has significant leverage on downstream operations such as crushing and milling, and there is a body of evidence that blasting affects mineral liberation [2]. Thus, the role of blasting has grown from simply fragmenting and loosening the rock mass to a broader one that encompasses many aspects of mining and affects the cost of the end product. The approach proposed in this paper facilitates this trend: 'to treat non-homogeneous media (rock mass) in a non-homogeneous manner (an asymmetric pattern) in order to achieve an optimal result (in terms of muckpile size distribution).' It is postulated that there are no logical reasons (besides the current lack of means to infer rock mass properties in the blind zones of the bench, and on-site precedent) for drilling a regular blast pattern over a rock mass that is inherently heterogeneous. Real and theoretical examples of the method are presented.
Abstract:
Blast fragmentation can have a significant impact on the profitability of a mine. An optimum run of mine (ROM) size distribution is required to maximise the performance of downstream processes. If this fragmentation size distribution can be modelled and controlled, the operation will have made a significant advance towards improving its performance. Blast fragmentation modelling is an important step in Mine to Mill™ optimisation. It allows blast fragmentation distributions to be estimated for a range of rock mass, blast geometry and explosive parameters. These distributions can then be carried through models of downstream mining and milling processes to determine the optimum blast design. When a blast hole is detonated, rock breakage occurs in two different stress regions, compressive and tensile. In the first region, compressive stress waves form a 'crushed zone' directly adjacent to the blast hole. The second region, termed the 'cracked zone', occurs outside the crushed zone. The widely used Kuz-Ram model does not recognise these two blast regions. In the Kuz-Ram model the mean fragment size from the blast is estimated and then used to generate the remaining size distribution. Experience has shown that this model predicts the coarse end reasonably accurately, but it can significantly underestimate the amount of fines generated. As part of the Australian Mineral Industries Research Association (AMIRA) P483A Mine to Mill™ project, the Two-Component Model (TCM) and Crush Zone Model (CZM), developed by the Julius Kruttschnitt Mineral Research Centre (JKMRC), were compared and evaluated against measured ROM fragmentation distributions. An important criterion for this comparison was the deviation of model results from measured ROM in the fine to intermediate section (1-100 mm) of the fragmentation curve. This region of the distribution is important for Mine to Mill™ optimisation.
The comparison of modelled and Split ROM fragmentation distributions has been conducted in harder ores (UCS greater than 80 MPa). Further work involves modelling softer ores. The comparisons will be continued with future site surveys to increase confidence in the comparison of the CZM and TCM against Split results. Stochastic fragmentation modelling will then be conducted to take into account the variation of input parameters, so that a window of possible fragmentation distributions can be compared with those obtained by Split. Following this work, an improved fragmentation model will be developed in response to these findings.
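As context for the Kuz-Ram discussion above, a minimal sketch of how the model builds a full size distribution from an estimated mean fragment size (the Kuznetsov equation feeding a Rosin-Rammler curve) might look as follows. The parameter values are purely illustrative, not data from the study:

```python
import math

def kuznetsov_x50(A, K, Q, rws):
    """Kuznetsov mean fragment size, returned in mm.
    A: rock factor, K: powder factor (kg/m^3),
    Q: explosive mass per hole (kg),
    rws: relative weight strength of the explosive (ANFO = 100)."""
    x50_cm = A * K**-0.8 * Q**(1.0 / 6.0) * (115.0 / rws)**(19.0 / 20.0)
    return 10.0 * x50_cm  # cm -> mm

def rosin_rammler_passing(x, x50, n):
    """Cumulative fraction passing screen size x (mm) for a
    Rosin-Rammler distribution with uniformity index n."""
    return 1.0 - math.exp(-0.693 * (x / x50)**n)

# Illustrative values only: medium-strength rock, bulk ANFO.
x50 = kuznetsov_x50(A=7.0, K=0.6, Q=120.0, rws=100.0)
curve = [rosin_rammler_passing(x, x50, n=1.5) for x in (10, 100, 1000)]
```

By construction the fraction passing at x = x50 is 1 - e^(-0.693), i.e. about 0.5, which is what makes x50 the characteristic mid-size of the curve. The weakness noted above follows directly: a single Rosin-Rammler curve has no separate crushed-zone component, so the fine end (1-100 mm) can be underestimated, which is what the TCM and CZM address.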
Abstract:
The natural variability of soils makes knowledge of their properties for geotechnical design complex, and the undrained shear strength is an important parameter in stability analyses of soft soils. Non-conventional laboratory fall cone and vane tests, field vane and piezocone tests, and unconfined compression and unconsolidated undrained triaxial tests were used to measure the undrained strength of a soft marine clay layer located in the central Brazilian coastal plain. The laboratory tests were performed on undisturbed samples collected with stationary piston samplers in a borehole close to the field tests. The site was first investigated by standard percussion borings, and the stratigraphic profile is presented by means of computational modelling. Tests were also carried out for physical characterization (grain-size analysis, water content, liquid and plastic limits, specific gravity of solids) and mineralogical characterization (X-ray diffraction), as well as consolidation tests to obtain the stress history and to classify the quality of the undisturbed samples. The undrained strength values obtained from the laboratory tests were compared with the continuous strength profile determined empirically from the piezocone test, with the cone factor Nkt calibrated against the field vane test; agreement was good, with the natural variability of the soil having a predominant influence, through sample quality, on the scatter between results. The strength values obtained from the laboratory fall cone and vane tests were compared with each other and showed good compatibility. Neither, however, agreed well with the field vane test.
The strength results obtained from the unconfined compression and triaxial tests showed good compatibility with the laboratory fall cone results, which was not the case for the laboratory vane results. When the strength normalized by the preconsolidation stress obtained by the various methods was compared with empirical correlations from the international literature, samples with a plasticity index above 60% showed good agreement with the correlations of Mesri (1975) and Jamiolkowski et al. (1985). The non-conventional tests proved reliable, which, combined with their simplicity and speed of execution, justifies their wider adoption in Brazilian geotechnical site investigation practice as an alternative method to complement and support estimates of the undrained strength of soft soils.
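The Nkt calibration described above can be sketched numerically. The following illustrates the standard piezocone relation su = (qt - sigma_v0) / Nkt, with Nkt back-calculated at a depth where a field vane strength is available; the numbers are hypothetical, not values from the study:

```python
def calibrate_nkt(qt_kpa, sigma_v0_kpa, su_vane_kpa):
    """Back-calculate the cone factor Nkt at a depth where a
    field vane undrained strength is available:
    Nkt = (qt - sigma_v0) / su."""
    return (qt_kpa - sigma_v0_kpa) / su_vane_kpa

def su_from_piezocone(qt_kpa, sigma_v0_kpa, nkt):
    """Continuous undrained strength profile from the corrected cone
    resistance qt and the total vertical stress sigma_v0 (both kPa)."""
    return (qt_kpa - sigma_v0_kpa) / nkt

# Hypothetical calibration depth: qt = 500 kPa, sigma_v0 = 80 kPa,
# field vane su = 30 kPa -> Nkt = 14.
nkt = calibrate_nkt(500.0, 80.0, 30.0)
su = su_from_piezocone(520.0, 85.0, nkt)  # strength at another depth
```

Once Nkt is fixed from one or more vane depths, the same relation is applied to the whole piezocone sounding, which is what yields the continuous profile against which the laboratory strengths were compared.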
Abstract:
Real cloud and ubiquitous manufacturing systems require effectiveness and permanent availability of resources, together with capacity and scalability. One of the most important problems in managing applications over cloud-based platforms, which are expected to support efficient scalability and resource coordination under the SaaS implementation model, is their interoperability. Even though application dashboards need to incorporate new applications easily, interoperability remains a major obstacle to overcome. The possibility of extending these dashboards with efficiently integrated, communication-oriented cloud-based services (cloudlets) therefore represents relevant added value and contributes to solving the interoperability problem. Following the architecture for the integration of enriched existing cloud services, as instances of manufacturing resources, this paper: a) proposes a cloud-based web platform to support a dashboard integrating communicational services, and b) describes an experiment supporting the thesis that effective and efficient interoperability, especially in dynamic environments, can be achieved only with human intervention.
Abstract:
Calls for colleges and universities to improve their productivity are coming thick and fast in Brazil. Many studies suggest evaluation systems and external criteria to control the quality of teaching and research in universities. Since universities and colleges are not profit-oriented organizations (considering only legitimate and serious research and teaching organizations, of course), the traditional microeconomic and administrative variables used to measure efficiency have no direct application. An alternative is to create an "as if" market control system to evaluate performance in universities and colleges: internal budget and resource allocation mechanisms can be used as incentive instruments to improve quality and productivity. This is the main issue addressed in this article.
Abstract:
This paper presents the results of an empirical study of 186 tourist accommodation businesses in Spain certified under the "Q for Tourist Quality", Spain's own quality management system. Its purpose was to analyze the structure of the relationship between critical quality factors and social-impact results, how they operate, and the level of their influence on obtaining these results within the company. Starting from a thorough theoretical review, we propose a theoretical model together with the hypotheses to be tested, and proceed to validation using the technique of Structural Equation Modelling. The results show that companies wishing to improve their social impact should take into account that leadership is the most important factor in achieving it. Leadership affects social impact indirectly, through its influence on alliances and resources, quality policy/planning, personnel management and learning.
Abstract:
Solubility measurements of quinizarin (1,4-dihydroxyanthraquinone), disperse red 9 (1-(methylamino)anthraquinone), and disperse blue 14 (1,4-bis(methylamino)anthraquinone) in supercritical carbon dioxide (SC CO2) were carried out in a flow-type apparatus, at temperatures from (333.2 to 393.2) K and pressures from (12.0 to 40.0) MPa. The mole fraction solubility of the three dyes decreases in the order quinizarin (2.9 x 10(-6) to 2.9 x 10(-4)), red 9 (1.4 x 10(-6) to 3.2 x 10(-4)), and blue 14 (7.8 x 10(-8) to 2.2 x 10(-5)). Four semiempirical density-based models were used to correlate the solubility of the dyes in SC CO2. From the correlation results, the total heat of reaction, i.e. the heat of vaporization plus the heat of solvation of the solute, was calculated and compared with results presented in the literature. The solubilities of the three dyes were also correlated using the Soave-Redlich-Kwong cubic equation of state (SRK CEoS) with classical mixing rules, and the physical properties required for the modeling were estimated and reported.
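One of the best-known semiempirical density-based correlations of the kind mentioned above is Chrastil's model, ln c = k ln(rho) + a/T + b. A minimal sketch follows; the constants k, a and b are placeholders to be fitted to measured solubility data, not values from this study:

```python
import math

def chrastil_solubility(rho, T, k, a, b):
    """Chrastil density-based model: ln c = k*ln(rho) + a/T + b.
    rho: solvent (CO2) density in kg/m^3, T: temperature in K,
    k, a, b: constants regressed from measured solubility data.
    Returns c in the units implied by the fitted constants."""
    return math.exp(k * math.log(rho) + a / T + b)

# Placeholder constants for illustration only (k = 1, a = b = 0
# reduces the model to c = rho, a convenient sanity check).
c = chrastil_solubility(rho=800.0, T=350.0, k=1.0, a=0.0, b=0.0)
```

In Chrastil's formulation the temperature coefficient a is proportional to the total heat of solution (vaporization plus solvation, divided by the gas constant), which is how fits of density-based models yield the heats compared with literature values in the abstract.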
Abstract:
Renewable-based power generation has increased significantly over the last years. However, this process has evolved separately from electricity markets, so present market models are ill-suited to cope with large quantities of renewable energy resources and to take full advantage of existing and envisaged renewable-based and distributed energy resources. This paper proposes the modelling of electricity markets at several levels (continental, regional and micro), taking into account the specific characteristics of the players and resources involved at each level and ensuring that the proposed models accommodate adequate business models able to support the contribution of all the resources in the system, from the largest to the smallest. The proposed market models are integrated in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), using the advantages of the multi-agent approach to overcome the inadequacy and significant limitations of existing electricity market simulators in dealing with the complex market models that must be adopted.
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
Master's in Geotechnical and Geoenvironmental Engineering
Abstract:
Objectives - To review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonising practices worldwide. Methods - A literature search was performed across different sources to identify guidance documents for QA in mammography available worldwide from international bodies, healthcare providers and professional/scientific associations. The guidance documents identified were reviewed, and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies, focusing on dose and image quality (IQ) performance assessment. Results - Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing the acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and of testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies; recommended testing focused on the assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. Conclusions - The existing QA guidance for mammography derives from key documents (American College of Radiology and European Union guidelines) and proposes similar tests, despite variations in detail and methodology. Studies reporting QA data should detail the experimental technique to allow robust data comparison. Countries aiming to implement a mammography QA programme may select/prioritise tests depending on the available technology and resources.
Abstract:
Scientific dissertation submitted for the degree of Master in Civil Engineering
Abstract:
The expansion of the area occupied by the Port of Leixões (Matosinhos and Leça da Palmeira) over highly compressible soils of fluvial and marine origin makes it necessary to resort to engineering for solutions suited to the intended works. Many civil/geotechnical engineering projects in the port area have therefore been executed on these soils. Examples include the consolidation of the embankment and construction of the crane runways of the TC4S container terminal, the rehabilitation of a 110 m stretch of the South Quay and the East Quay of Dock No. 4, and the construction of the main gatehouse of the Port of Leixões, all in the buried valley of the Leça river. Many methods are available for improving these soils. Stone columns reinforce the soil, increase its bearing capacity and act as vertical drains. To address excessive deformations during and after construction, an alternative is to accelerate the consolidation of the soft soil layer, for which preloading combined with vertical drains is usual. When the construction schedule requires the embankment to be used immediately, a viable solution is to install piles that transfer the embankment load, or part of it, to more competent layers. The original soil can also be removed and replaced with one of higher quality. The most recent ground improvement technique by injection, jet grouting, is used in a variety of situations, including temporary and permanent works. This work aims to describe, as a function of the various factors involved, the behaviour of the soil under the different methods used and the intended objectives, which are addressed in the empirical part of the work.
Abstract:
Nowadays, smartphones and other mobile devices are being endowed with ever greater computational power and are capable of running a wide range of applications, from simple note-taking programs to sophisticated navigation software. However, even with the evolution of their hardware, current mobile devices still lack the capabilities of desktop or laptop computers. One possible solution to this problem is to distribute the application, executing parts of it on the local device and the rest on other devices connected to the network. In addition, some types of applications, such as multimedia applications, electronic games or immersive-environment applications, have Quality of Service requirements, particularly real-time ones. This thesis proposes a remote code execution system for distributed systems with real-time constraints. The proposed architecture suits systems that need to execute the same set of functions periodically and in parallel with real-time guarantees, even when the execution times of those functions are unknown. The proposed platform was developed for mobile systems capable of running the Android operating system.
Abstract:
Final Master's project submitted for the degree of Master in Engineering, Specialization in Road Infrastructure and Transport