958 results for Bowker Collection Analysis Tool
Abstract:
Gold ore processing plants increasingly seek low-cost production and maximised financial returns. Technological characterization is part of a multidisciplinary approach that adds knowledge, optimization alternatives and reductions in operating costs. As a tool within technological characterization, automated image analysis plays an important role in the mineral sector, chiefly because of the speed of the analyses, their statistical robustness and the reliability of the results. The technique is performed on images acquired with a scanning electron microscope, combined with chemical microanalyses, and is used at several stages of a mining enterprise. This study aims at the technological characterization of gold ore from the Morro do Ouro Mine, Minas Gerais, using automated image analysis by MLA on a set of 88 samples. It was found that 90% of the gold occurs in the fraction above 0.020 mm; quartz and mica account for about 80% of the total ore mass; the sulfides have equivalent circle diameters between 80 and 100 µm and are represented by pyrite and arsenopyrite, with subordinate pyrrhotite, chalcopyrite, sphalerite and galena. The gold is mostly associated with pyrite and arsenopyrite, and as the arsenic grade increases, so does the share of gold associated with arsenopyrite. The medians of the gold grain size distributions average 19 µm. The composition of the gold grains is quite diverse, on average 77% gold and 23% silver. For material below 0.50 mm, a significant share of the gold grain perimeter is exposed (73% on average); included gold (21% of all gold grains) is associated with pyrite and arsenopyrite, and in 14 of the 88 samples it can exceed 40% of the total contained gold. Automated image analysis proved highly efficient at defining these particular characteristics, objectively supporting mine planning and mineral processing work.
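By way of illustration, a minimal sketch of how summary statistics like those above might be derived from per-grain MLA measurements; the field names and sample values are invented, not the study's data.

```python
# A sketch of MLA-style grain summaries: size-fraction share, median
# grain size and mean exposed perimeter. Data below are invented.
from statistics import median

# each grain: equivalent circle diameter (mm) and fraction of perimeter exposed
grains = [{"ecd_mm": 0.019, "exposed": 0.55},
          {"ecd_mm": 0.025, "exposed": 0.80},
          {"ecd_mm": 0.040, "exposed": 0.90}]

above_20um = sum(g["ecd_mm"] > 0.020 for g in grains) / len(grains)
print(f"share above 0.020 mm: {above_20um:.0%}")
print(f"median ECD: {median(g['ecd_mm'] for g in grains) * 1000:.0f} µm")
print(f"mean exposed perimeter: "
      f"{sum(g['exposed'] for g in grains) / len(grains):.0%}")
```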
Abstract:
The consumer market has undergone several transformations over time, driven mainly by technological evolution. Technological evolution has given consumers the ability to choose among products and brands, and the opportunity to collaborate and to influence the opinions of other consumers by sharing experiences, mainly through digital platforms. CRM (customer relationship management) is the means companies use to get to know the consumer and build a satisfactory relationship between company and consumer. This relationship aims to satisfy and retain consumers, preventing them from abandoning the brand and from negatively influencing other consumers. e-CRM is electronic customer relationship management, which has all the traditional features of CRM plus the digital environment. The digital environment has reduced the distance between people and companies and has become a collaborative, low-cost means of interacting with consumers. On the other hand, it is a medium in which the consumer ceases to be passive and becomes active, able to influence not just a small group of friends but an entire network of consumers. Digital analytics is the measurement, collection, analysis and reporting of digital data for the purposes of understanding and optimizing business performance. The use of digital data supports the development of e-CRM by revealing consumer behaviour in an environment where the consumer is active. The digital environment allows more detailed knowledge of consumers, based not only on purchasing habits but also on interests and interactions. The main objective of this study is to understand how companies apply e-CRM concepts in their business strategies, how digital analytics contributes to the development of e-CRM, and how the critical success factors (human, technological and strategic) affect the implementation and development of e-CRM. Four companies from different segments were studied through case studies. Companies increasingly seek to exploit e-CRM strategies in the digital environment, but limitations were identified in the capture, storage and analysis of multichannel information, especially across digital channels. Other factors, such as senior management support and employees' understanding of strategies focused on the individual consumer, were also identified in this study. The study identified the information most relevant to generating electronic customer relationship management strategies, as well as the most relevant aspects of the critical success factors.
Abstract:
The Eastern Academic Scholars’ Trust (EAST) is a shared print initiative involving 48 libraries across the Northeast. Initiated in 2012 with a planning grant from the Andrew W. Mellon Foundation, EAST directly addresses the growing need for academic libraries to ensure that monographs and journals of scholarly value are not inadvertently discarded as they undertake weeding and deselection programs to free up space for other library services. EAST is currently completing a large-scale analysis of collections across 40 of the participating libraries. This analysis will provide insight into both uniqueness and overlap across the libraries’ holdings and will result in agreements by the libraries to retain circulating monographs in their local collections for an agreed-upon time period and to make those materials available to researchers and scholars from other EAST libraries. In parallel with this collection analysis, EAST is implementing validation sampling across the libraries to better understand volume availability and condition and the role they may play in retention decisions. The project team has developed an innovative sampling methodology and tools to support the study. As the largest shared print initiative to date, this project will secure a substantial portion of the scholarly record that is held in the Northeast and positions EAST as an important component of the growing network of shared print initiatives nationally.
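As an illustration of the kind of holdings comparison involved, here is a minimal sketch assuming each library's circulating monographs are represented as a set of record identifiers (e.g. OCLC numbers); the names are hypothetical, not EAST's actual tooling.

```python
# Toy uniqueness/overlap analysis across library holdings.
from collections import Counter

def overlap_profile(holdings: dict[str, set[str]]) -> Counter:
    """Count how many libraries hold each record."""
    counts = Counter()
    for records in holdings.values():
        counts.update(records)
    return counts

def uniqueness(holdings: dict[str, set[str]]) -> dict[str, int]:
    """Number of records each library holds that no other library does."""
    counts = overlap_profile(holdings)
    return {lib: sum(1 for r in records if counts[r] == 1)
            for lib, records in holdings.items()}

if __name__ == "__main__":
    sample = {
        "LibA": {"ocn100", "ocn101", "ocn102"},
        "LibB": {"ocn101", "ocn103"},
        "LibC": {"ocn101", "ocn102", "ocn104"},
    }
    print(uniqueness(sample))  # {'LibA': 1, 'LibB': 1, 'LibC': 1}
```

Retention decisions would then weigh uniquely held titles differently from widely duplicated ones.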
Abstract:
Diversity-based designing, or the goal of ensuring that web-based information is accessible to as many diverse users as possible, has received growing international acceptance in recent years, with many countries introducing legislation to enforce it. This paper analyses web content accessibility levels in Spanish education portals according to the international guidelines established by the World Wide Web Consortium (W3C) and the Web Accessibility Initiative (WAI). Additionally, it suggests the calculation of an inaccessibility rate as a tool for measuring the degree of non-compliance with WAI Guidelines 2.0 as well as illustrating the significant gap that separates people with disabilities from digital education environments (with a 7.77% average). A total of twenty-one educational web portals with two different web depth levels (42 sampling units) were assessed for this purpose using the automated analysis tool Web Accessibility Test 2.0 (TAW, for its initials in Spanish). The present study reveals a general trend towards non-compliance with the technical accessibility recommendations issued by the W3C-WAI group (97.62% of the websites examined present mistakes in Level A conformance). Furthermore, despite the increasingly high number of legal and regulatory measures about accessibility, their practical application still remains unsatisfactory. A greater level of involvement must be assumed in order to raise awareness and enhance training efforts towards accessibility in the context of collective Information and Communication Technologies (ICTs), since this represents not only a necessity but also an ethical, social, political and legal commitment to be assumed by society.
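The abstract does not give the exact formula, but an inaccessibility rate of this kind can be sketched as failed checks over applicable checks per page, averaged over a portal's sampling units; the function names and figures below are illustrative assumptions, not the paper's definition.

```python
# Toy inaccessibility-rate calculation for a web portal.
def page_rate(failed: int, applicable: int) -> float:
    """Share of applicable accessibility checks a page fails."""
    return failed / applicable if applicable else 0.0

def portal_rate(pages: list[tuple[int, int]]) -> float:
    """Average rate over sampling units, e.g. the two web depth levels."""
    rates = [page_rate(f, a) for f, a in pages]
    return sum(rates) / len(rates)

# e.g. a portal whose home page fails 9 of 120 checks and whose
# second-level page fails 6 of 95:
print(f"{portal_rate([(9, 120), (6, 95)]):.2%}")  # ~6.91%
```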
Abstract:
Implant failures and postoperative complications are often associated with bone drilling. Estimation and control of drilling parameters are critical to prevent mechanical damage to the bone tissues. For better performance of the drilling procedures, it is essential to understand the mechanical behaviour of bones that leads to their failures and consequently to improve the cutting conditions. This paper investigates the effect of drill speed and feed-rate on mechanical damage during drilling of solid rigid foam materials, with similar mechanical properties to the human bone. Experimental tests were conducted on biomechanical blocks instrumented with strain gauges to assess the drill speed and feed-rate influence. A three-dimensional dynamic finite element model to predict the bone stresses, as a function of drilling conditions, drill geometry and bone model, was developed. These simulations incorporate the dynamic characteristics involved in the drilling process. The element removal scheme is taken into account and allows advanced simulations of tool penetration and material removal. Experimental and numerical results show that generated stresses in the material tend to increase with tool penetration. Higher drill speed leads to an increase of von Mises stresses and strains in the solid rigid foams. However, when the feed-rate is higher, the stresses and strains are lower. The numerical normal stresses and strains are found to be in good agreement with experimental results. The models could be an accurate analysis tool to simulate the stress distribution in the bone during the drilling process.
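For reference, the von Mises equivalent stress reported by such simulations is computed from the six Cauchy stress components; a minimal sketch (units are whatever the caller uses, e.g. MPa throughout):

```python
# Von Mises equivalent stress from normal (sx, sy, sz) and shear
# (txy, tyz, tzx) components of the stress tensor.
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

# Uniaxial check: a pure axial stress returns itself.
assert abs(von_mises(100.0, 0, 0, 0, 0, 0) - 100.0) < 1e-9
```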
Abstract:
Dynamic spatial analysis addresses computational aspects of space–time processing. This paper describes the development of a spatial analysis tool and modelling framework that together offer a solution for simulating landscape processes. A better approach to integrating landscape spatial analysis with Geographical Information Systems is advocated in this paper. Enhancements include special spatial operators and map algebra language constructs to handle dispersal and advective flows over landscape surfaces. These functional components to landscape modelling are developed in a modular way and are linked together in a modelling framework that performs dynamic simulation. The concepts and modelling framework are demonstrated using a hydrological modelling example. The approach provides a modelling environment for scientists and land resource managers to write and to visualize spatial process models with ease.
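A minimal sketch of one such map-algebra operator: a single explicit time step of dispersal over a raster surface. The function and parameter names are illustrative, not the paper's actual language constructs, and the wrap-around boundary is a simplification.

```python
# One dispersal (diffusion) step over a raster: each cell sheds a
# fraction of its value equally to its four neighbours.
import numpy as np

def disperse(grid: np.ndarray, rate: float = 0.2) -> np.ndarray:
    out = grid * (1.0 - rate)
    share = grid * rate / 4.0
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        out += np.roll(share, shift, axis=axis)  # wraps at edges (toy boundary)
    return out

surface = np.zeros((5, 5))
surface[2, 2] = 100.0        # point source
for _ in range(10):          # iterate the operator, as a dynamic model would
    surface = disperse(surface)
print(surface.round(1))      # mass spreads outward; total is conserved
```

A dynamic modelling framework chains operators like this one (dispersal, advection, accumulation) into a simulation loop.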
Abstract:
Purpose – The purpose of this paper is to consider the current status of strategic group theory in the light of developments over the last three decades, and then to discuss the continuing value of the concept, both to strategic management research and practising managers. Design/methodology/approach – Critical review of the idea of strategic groups together with a practical strategic mapping illustration. Findings – Strategic group theory still provides a useful approach for management research, which allows a detailed appraisal and comparison of company strategies within an industry. Research limitations/implications – Strategic group research would undoubtedly benefit from more directly comparable, industry-specific studies, with a more careful focus on variable selection and the statistical methods used for validation. Future studies should aim to build sets of industry-specific variables that describe strategic choice within that industry. The statistical methods used to identify strategic groupings need to be robust to ensure that strategic groups are not solely an artefact of method. Practical implications – The paper looks specifically at an application of strategic group theory in the UK pharmaceutical industry. The practical benefits of strategic groups as a classification system and of strategic mapping as a strategy development and analysis tool are discussed. Originality/value – The review of strategic group theory alongside alternative taxonomies and application of the concept to the UK pharmaceutical industry.
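As an illustration of the statistical step in question, a minimal sketch of identifying candidate strategic groups by clustering firms on industry-specific strategy variables; the variables and data are invented, and k-means is only one of several methods used in this literature.

```python
# Toy strategic-group identification: cluster firms on standardised
# strategy variables and read groups off the cluster labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows = firms; columns = e.g. R&D intensity, marketing spend ratio,
# product-line breadth (whatever describes strategic choice in the industry)
X = np.array([[0.18, 0.05, 3], [0.16, 0.07, 4], [0.02, 0.20, 12],
              [0.03, 0.22, 10], [0.09, 0.12, 7]])

labels = KMeans(n_clusters=2, n_init=10, random_state=0) \
    .fit_predict(StandardScaler().fit_transform(X))
print(labels)  # firms sharing a label form a candidate strategic group
```

Robustness checks (alternative algorithms, variable sets and cluster counts) guard against groups that are merely artefacts of the method.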
Abstract:
A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators, their parameter rates and real world multi-dimensional applications. A series of experiments were conducted, comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation and its viability as a serious real world analysis tool. The first problem involved optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
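A minimal sketch of the multi-chromosome idea: one individual carries several chromosomes, each with its own representation and mutation operator. The names and operators below are illustrative; the Multi-GA's actual operators (e.g. quotient crossover, addition, deletion) are not reproduced here.

```python
# An individual holding chromosomes of different types, each paired
# with an operator that respects its representation.
import random

class Chromosome:
    def __init__(self, genes, mutate):
        self.genes = genes
        self.mutate = mutate     # per-chromosome operator

def flip(genes):                 # binary representation
    i = random.randrange(len(genes))
    genes[i] ^= 1

def jitter(genes):               # real-valued representation
    i = random.randrange(len(genes))
    genes[i] += random.gauss(0.0, 0.1)

individual = [Chromosome([0, 1, 1, 0], flip),
              Chromosome([2.5, -0.3], jitter)]

for chrom in individual:         # mutation is applied per chromosome
    if random.random() < 0.2:
        chrom.mutate(chrom.genes)
```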
Abstract:
This thesis reports the findings of three studies examining relationship status and identity construction in the talk of heterosexual women, from a feminist and social constructionist perspective. Semi-structured interviews were conducted with 12 women in study 1 and 13 women for study 2, between the ages of twenty and eighty-seven, discussing their experiences of relationships. All interviews were transcribed and analysed using discourse analysis, by hand and using the Nudist 6 program. The resulting themes reveal distinct age-related marital status expectations. Unmarried women were aware they had to marry by a ‘certain age’ or face a ‘lonely spinsterhood’. Through marriage women gained a socially accepted position associated with responsibility for others, self-sacrifice, a home-focused lifestyle and relational identification. Divorce was constructed as the consequence of personal faults and poor relationship care, reassuring the married of their own control over their status. Older unmarried women were constructed as deviant and pitiable, occupying social purgatory as a result of transgressing these valued conventions. Study 3 used repertory grid tasks, with 33 women, analysing transcripts and notes alongside numerical data using the Web Grid II internet analysis tool, to produce principal components maps showing the relationships between relationship terms and statuses. This study illuminated the consistency with which women of different ages and status saw marriage as their ideal living situation and outlined the domestic responsibilities associated. Spinsters and single-again women were defined primarily by their lack of marriage and by loneliness. This highlighted the devalued position of older unmarried women. The results of these studies indicated a consistent set of age-related expectations of relationship status, acknowledged by women and reinforced by their families and friends, which render many unmarried women deviant and fail to acknowledge the potential variety of women’s ways of living.
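For readers unfamiliar with repertory grid maps, a minimal sketch of the principal-components step behind them, with invented ratings: elements (here, relationship statuses) rated on personal constructs are projected onto two components.

```python
# Toy repertory-grid PCA: project rated elements onto two components.
import numpy as np

# rows = constructs, columns = elements rated 1-5 (invented data)
grid = np.array([[5, 1, 2], [4, 2, 1], [1, 5, 4], [2, 4, 5]], float)
centred = grid - grid.mean(axis=1, keepdims=True)

# principal components via SVD of the centred grid
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
coords = (np.diag(s) @ Vt)[:2].T   # element positions on PC1/PC2
print(coords)                       # plotting these gives the 2-D map
```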
Abstract:
Transportation service operators are witnessing a growing demand for bi-directional movement of goods. Given this, the following thesis considers an extension to the vehicle routing problem (VRP) known as the delivery and pickup transportation problem (DPP), where delivery and pickup demands may occupy the same route. The problem is formulated here as the vehicle routing problem with simultaneous delivery and pickup (VRPSDP), which requires the concurrent service of the demands at the customer location. This formulation provides the greatest opportunity for cost savings for both the service provider and recipient. The aims of this research are to propose a new theoretical design to solve the multi-objective VRPSDP, provide software support for the suggested design and validate the method through a set of experiments. A new real-life based multi-objective VRPSDP is studied here, which requires the minimisation of the often conflicting objectives: operated vehicle fleet size, total routing distance and the maximum variation between route distances (workload variation). The former two objectives are commonly encountered in the domain and the latter is introduced here because it is essential for real-life routing problems. The VRPSDP is defined as a hard combinatorial optimisation problem, therefore an approximation method, the Simultaneous Delivery and Pickup method (SDPmethod), is proposed to solve it. The SDPmethod consists of three phases. The first phase constructs a set of diverse partial solutions, where one is expected to form part of the near-optimal solution. The second phase determines assignment possibilities for each sub-problem. The third phase solves the sub-problems using a parallel genetic algorithm. The suggested genetic algorithm is improved by the introduction of a set of tools: a genetic operator switching mechanism via diversity thresholds, an accuracy analysis tool and a new fitness evaluation mechanism. This three-phase method is proposed to address the shortcoming that exists in the domain, where an initial solution is built only then to be completely dismantled and redesigned in the optimisation phase. In addition, a new routing heuristic, RouteAlg, is proposed to solve the VRPSDP sub-problem, the travelling salesman problem with simultaneous delivery and pickup (TSPSDP). The experimental studies are conducted using the well-known Salhi and Nagy (1999) benchmark test problems, where the SDPmethod and RouteAlg solutions are compared with the prominent works in the VRPSDP domain. The SDPmethod proved to be an effective method for solving the multi-objective VRPSDP, as did the RouteAlg for the TSPSDP.
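A minimal sketch of the three objectives named above, evaluated for a candidate solution given as a list of routes; the representation and names are illustrative, and capacity and pickup/delivery feasibility checks are omitted.

```python
# Evaluate (fleet size, total distance, workload variation) for a
# candidate VRPSDP solution: routes of customer coordinates, depot
# first and last.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    return sum(dist(p, q) for p, q in zip(route, route[1:]))

def objectives(routes):
    lengths = [route_length(r) for r in routes]
    return (len(routes),                   # operated vehicle fleet size
            sum(lengths),                  # total routing distance
            max(lengths) - min(lengths))   # workload variation

depot = (0, 0)
sol = [[depot, (2, 1), (4, 0), depot], [depot, (-3, 2), depot]]
print(objectives(sol))
```

A multi-objective GA would rank candidate solutions on such a tuple, e.g. by Pareto dominance or a weighted sum.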
Abstract:
Despite recent research on time (e.g. Hedaa & Törnroos, 2001), consideration of the time dimension in data collection, analysis and interpretation in supply network research is, to date, still limited. Drawing on a body of literature from organization studies, and empirical findings from a six-year action research programme and a related study of network learning, we reflect on time, timing and timeliness in interorganizational networks. The empirical setting is supply networks in the English health sector, wherein we identify and elaborate various issues of time, within the case and in terms of research process. Our analysis is wide-ranging and multi-level, from the global (e.g. identifying the notion of life cycles) to the particular (e.g. different cycle times in supply, such as daily for deliveries and yearly for contracts). We discuss the ‘speeding up’ of inter-organizational ‘e’ time and tensions with other time demands. In closing the paper, we relate our conclusions to the future conduct of the research programme and supply research more generally, and to the practice of managing supply (in) networks.
Abstract:
The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find out the source of a security attack or other information security incidents. Existing network forensics work has been mostly focused on the Internet and fixed networks. But the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitate the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and report logged incidents. For recording incidents, location is considered essential to documenting network incidents. However, in network topology spaces, location cannot be measured due to the absence of a 'distance metric'. Therefore, a novel solution was proposed to label locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHT) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased their ability to track and trace attacks to their source. These techniques were a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
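A minimal sketch of the DHT idea behind such reporting: each logged incident is stored at the node responsible for the hash of its key on a consistent-hashing ring. This toy ring is illustrative only, not the dissertation's actual mechanism.

```python
# Toy consistent-hashing ring: route an incident report to the node
# whose position follows the key's hash on the ring.
import hashlib
from bisect import bisect

class ToyDHT:
    def __init__(self, node_ids):
        self.ring = sorted((self._h(n), n) for n in node_ids)

    @staticmethod
    def _h(key: str) -> int:
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    def responsible(self, key: str) -> str:
        """First node clockwise from the key's position on the ring."""
        i = bisect(self.ring, (self._h(key),)) % len(self.ring)
        return self.ring[i][1]

dht = ToyDHT(["node-a", "node-b", "node-c"])
incident_key = "incident:2024-01-05T10:14Z:src=26ab"
print(dht.responsible(incident_key))  # node that stores this report
```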
Abstract:
Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early on in the development cycle. Software architecture design has received a lot of attention in the past years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for the behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets and the properties in first order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of Software Architectures. The Software Architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined which includes the evaluation of test cases based on Petri net testing theory to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
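A minimal sketch of the Petri-net semantics underlying a SAM behavioural model: a transition is enabled when each of its input places holds enough tokens, and firing moves tokens from input to output places. The tiny net below is invented for illustration.

```python
# Toy Petri-net enabling and firing rules over a marking
# (a dict mapping place names to token counts).
def enabled(marking, transition):
    return all(marking.get(p, 0) >= w for p, w in transition["in"].items())

def fire(marking, transition):
    m = dict(marking)
    for p, w in transition["in"].items():
        m[p] -= w
    for p, w in transition["out"].items():
        m[p] = m.get(p, 0) + w
    return m

send = {"in": {"ready": 1}, "out": {"sent": 1}}   # a connector's action
marking = {"ready": 1}
if enabled(marking, send):
    marking = fire(marking, send)
print(marking)  # {'ready': 0, 'sent': 1}
```

Model checking then explores the reachable markings of such a net against temporal-logic properties, which is what the translation to Spin's input language automates.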
Abstract:
Acknowledgements Funding: Chest, Heart and Stroke Scotland, grant ref. R13/A148. The funder had no role in study design, data collection, analysis and interpretation, writing of the manuscript, and in the decision to submit the manuscript for publication. All authors had full access to all the data in the study. The corresponding author had final responsibility for the decision to submit for publication.
Abstract:
Acknowledgements We thank all the participants who took part, the research fellows (Kate Taylor, Robert Caslake, David McGhee, Angus Macleod) and nurses (Clare Harris, Joanna Gordon, Anne Hayman, Hazel Forbes) who helped assess the participants, and the study secretaries (Susan Kilpatrick, Pam Rebecca) and data management team (Katie Wilde, David Ritchie). The PINE study was funded by the BMA Doris Hillier award, Parkinson's UK, the RS McDonald Trust, NHS Grampian Endowments, SPRING and the BUPA Foundation. None of the funders had any influence in the study design, the collection, analysis and interpretation of the data, the writing of the report or the decision to submit the article for publication.