862 results for machine tools and accessories


Relevance:

100.00%

Publisher:

Abstract:

This book argues for novel strategies to integrate engineering design procedures and structural analysis data into architectural design. Algorithmic procedures that have recently migrated into architectural practice are used to improve the interface between the two disciplines. Architectural design is predominantly conducted as a negotiation process among various factors, but it often lacks the rigor and data structures needed to link it to quantitative procedures. Numerical structural design, on the other hand, could act as a role model for handling data and robust optimization, but it often lacks the complexity of architectural design. The goal of this research is to bring together robust methods from structural design and the complex dependency networks of architectural design processes. The book presents three case studies of tools and methods developed to exemplify, analyze and evaluate a collaborative workflow.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the development of hardware, theory, and experimental methods that enable a robotic manipulator arm to interact with soils and estimate soil properties from interaction forces. Unlike the majority of robotic systems interacting with soil, our objective is parameter estimation, not excavation. To this end, we design our manipulator with a flat plate so that interactions are easy to model. By using a flat plate, we take advantage of the wealth of research on the similar problem of earth pressure on retaining walls. A number of earth pressure models exist; these models typically provide force estimates whose relation to the true force is uncertain. A recent technique, known as numerical limit analysis, provides upper and lower bounds on the true force. Predictions from the numerical limit analysis technique are shown to be in good agreement with other accepted models. Experimental methods for plate insertion, soil-tool interface friction estimation, and control of the forces applied to the soil are presented. In addition, a novel graphical technique for inverting the soil models is developed, which is an improvement over standard nonlinear optimization. This graphical technique uses the uncertainties associated with each set of force measurements to obtain all possible parameters that could have produced the measured forces. The system is tested on three cohesionless soils, two in a loose state and one in both a loose and a dense state, and the results are compared with friction angles obtained from direct shear tests. The results highlight a number of key points. Common assumptions are made in soil modeling, most notably the Mohr-Coulomb failure law and perfectly plastic behavior. In the direct shear tests, a marked dependence of friction angle on normal stress at low stresses is found; this has ramifications for any study of friction done at low stresses. In addition, gradual failures are often observed for vertical tools and tools inclined away from the direction of motion. After accounting for the change in friction angle at low stresses, the results show good agreement with the direct shear values.
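The classical earth pressure models referred to above relate the force on a flat plate (or retaining wall) to the soil's friction angle. As a minimal sketch of that relationship, assuming the simplest Rankine conditions (smooth vertical plate, level cohesionless Mohr-Coulomb backfill) rather than the thesis's numerical limit analysis, the hypothetical Python functions below map a friction angle to earth pressure coefficients and to a resultant passive force per unit width; the numeric values in the example are invented.

```python
import math

def rankine_coefficients(phi_deg: float):
    """Rankine active/passive earth pressure coefficients for a cohesionless
    soil with friction angle phi (smooth vertical wall, level backfill)."""
    phi = math.radians(phi_deg)
    k_a = (1.0 - math.sin(phi)) / (1.0 + math.sin(phi))
    k_p = (1.0 + math.sin(phi)) / (1.0 - math.sin(phi))
    return k_a, k_p

def passive_force_per_width(phi_deg: float, gamma: float, depth: float) -> float:
    """Resultant passive force per unit width for a plate embedded to `depth`
    (m) in soil of unit weight `gamma` (kN/m^3): P_p = 0.5 * K_p * gamma * H^2."""
    _, k_p = rankine_coefficients(phi_deg)
    return 0.5 * k_p * gamma * depth ** 2

if __name__ == "__main__":
    # Hypothetical inputs (not data from the thesis): 30 degree friction angle,
    # 16 kN/m^3 dry sand, plate embedded 0.1 m.
    print(rankine_coefficients(30.0))                # approx (0.333, 3.0)
    print(passive_force_per_width(30.0, 16.0, 0.1))  # approx 0.24 kN/m
```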

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, we've focused on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life-cycle costs, then we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for the operators to see what's going on and then to manipulate the system so that it does what it is supposed to do. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's modern operating systems, with nearly 50 million source lines of code, are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the Object Abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this is a lesson that can impact tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access control model and how new layers of abstraction could further enrich this model.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores the concept of Value Stream Analysis and Mapping (VSA/M) as applied to Product Development (PD) efforts. Value Stream Analysis and Mapping is a method of business process improvement whose application began in the manufacturing community; PD efforts provide a different setting for its use. Site visits were made to nine major U.S. aerospace organizations, where interviews, discussions, and participatory events were used to gather data on (1) the sophistication of the tools used in PD process improvement efforts, (2) the lean context of the use of the tools, and (3) the success of the efforts. It was found that all three factors were strongly correlated, suggesting that success depends on both good tools and a lean context. Finally, a general VSA/M method for PD activities is proposed; the method uses modified process mapping tools to analyze and improve the process.

Relevance:

100.00%

Publisher:

Abstract:

How we get from transistors to logic gates, to ALUs and memory, to the stored program and the fetch-execute cycle, and on to machine code and high-level languages. Inspired by Tanenbaum's approach in "Structured Computer Organization".
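As a minimal sketch of the stored-program idea and the fetch-execute cycle the lecture builds up to, the toy accumulator machine below repeatedly fetches an instruction, decodes it, and executes it against a single memory; the opcode set and the example program are invented for illustration and are not taken from the lecture or from Tanenbaum's book.

```python
# Toy stored-program machine: instructions and data live in one memory list,
# and the CPU loops over fetch, decode and execute steps.
LOAD, ADD, STORE, HALT = 1, 2, 3, 0   # hypothetical opcodes

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        opcode, operand = memory[pc]      # fetch the next instruction
        pc += 1
        if opcode == LOAD:                # decode and execute
            acc = memory[operand]
        elif opcode == ADD:
            acc += memory[operand]
        elif opcode == STORE:
            memory[operand] = acc
        elif opcode == HALT:
            return memory
        else:
            raise ValueError(f"unknown opcode {opcode}")

# Program: memory[7] = memory[5] + memory[6]; data sits after the code.
memory = [(LOAD, 5), (ADD, 6), (STORE, 7), (HALT, 0), None, 2, 3, 0]
print(run(memory)[7])   # prints 5
```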

Relevance:

100.00%

Publisher:

Abstract:

Software Engineering team project introductory lecture and project start-up for 2014-2015. It covers team working, infrastructure tools, and an outline of the agile methods, practices and principles that will be used.

Relevance:

100.00%

Publisher:

Abstract:

In this session we'll explore how Microsoft uses data science and machine learning across its entire business, from Windows and Office to Skype and Xbox. We'll look at how companies across the world use Microsoft technology to empower their businesses in many different industries, and we'll look at data science technologies you can use yourselves, such as Azure Machine Learning and Power BI. Finally, we'll discuss job opportunities for data scientists and tips on how you can be successful!

Relevance:

100.00%

Publisher:

Abstract:

An emerging consensus in cognitive science views the biological brain as a hierarchically organized predictive processing system: a system in which higher-order regions are continuously attempting to predict the activity of lower-order regions at a variety of (increasingly abstract) spatial and temporal scales. The brain is thus revealed as a hierarchical prediction machine that is constantly engaged in the effort to predict the flow of information originating from the sensory surfaces. Such a view seems to afford a great deal of explanatory leverage when it comes to a broad swathe of seemingly disparate psychological phenomena (e.g., learning, memory, perception, action, emotion, planning, reason, imagination, and conscious experience). In the most positive case, the predictive processing story seems to provide our first glimpse of what a unified (computationally tractable and neurobiologically plausible) account of human psychology might look like. This obviously marks out one reason why such models should be the focus of current empirical and theoretical attention. Another reason, however, is rooted in the potential of such models to advance the current state of the art in machine intelligence and machine learning. Interestingly, the vision of the brain as a hierarchical prediction machine is one that establishes contact with work that goes under the heading of 'deep learning'. Deep learning systems often attempt to make use of predictive processing schemes and (increasingly abstract) generative models as a means of supporting the analysis of large data sets. But are such computational systems sufficient (by themselves) to provide a route to general human-level analytic capabilities? I will argue that they are not, and that closer attention to a broader range of forces and factors (many of which are not confined to the neural realm) may be required to understand what it is that gives human cognition its distinctive (and largely unique) flavour. The vision that emerges is one of 'homomimetic deep learning systems': systems that situate a hierarchically organized predictive processing core within a larger nexus of developmental, behavioural, symbolic, technological and social influences. Relative to that vision, I suggest that we should see the Web as a form of 'cognitive ecology', one that is as much involved with the transformation of machine intelligence as it is with the progressive reshaping of our own cognitive capabilities.
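As a very rough sketch of the prediction-error minimization at the heart of the predictive processing story (in the spirit of Rao-and-Ballard-style predictive coding, not any particular model discussed above), the toy NumPy example below lets a higher-level representation iteratively refine itself so that its top-down prediction of lower-level activity improves; the layer sizes, learning rate and data are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single predictive-coding layer: a higher-level cause vector r
# predicts lower-level activity x through a generative matrix W (x_hat = W r).
n_lower, n_higher = 8, 3
W = rng.normal(size=(n_lower, n_higher))
x = rng.normal(size=n_lower)          # "sensory" activity to be explained
r = np.zeros(n_higher)                # higher-level representation

lr = 0.05
for _ in range(200):
    x_hat = W @ r                     # top-down prediction
    error = x - x_hat                 # bottom-up prediction error
    r += lr * (W.T @ error)           # adjust r to reduce the prediction error

# The residual error that remains cannot be explained by this single layer;
# in a hierarchy it would be passed upward for higher levels to predict.
print("remaining prediction error:", float(np.linalg.norm(x - W @ r)))
```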

Relevance:

100.00%

Publisher:

Abstract:

Background: The tight junction (TJ) is one of the most important structures established during merozoite invasion of host cells, and a large number of the proteins stored in the apical organelles of Toxoplasma and Plasmodium parasites are involved in forming the TJ. Plasmodium falciparum and Toxoplasma gondii apical membrane antigen 1 (AMA-1) and the rhoptry neck proteins (RONs) are the two main TJ components. It has been shown that RON4 plays an essential role during merozoite and sporozoite invasion of target cells. This study focuses on characterizing a novel Plasmodium vivax rhoptry protein, RON4, which is homologous to PfRON4 and PkRON4. Methods: The ron4 gene was re-annotated in the P. vivax genome using various bioinformatics tools and taking the PfRON4 and PkRON4 amino acid sequences as templates. Gene synteny, as well as identity and similarity values between open reading frames (ORFs) belonging to the three species, were assessed. Transcription of pvron4, and the expression and localization of the encoded protein, were also determined in the VCG-1 strain by molecular and immunological studies. The nucleotide and amino acid sequences obtained for pvron4 in VCG-1 were compared to those from strains from different geographical areas. Results: PvRON4 is a 733-amino-acid protein encoded by three exons, with transcription and translation patterns similar to those reported for its homologue PfRON4. Sequencing PvRON4 from the VCG-1 strain and comparing it to P. vivax strains from different geographical locations revealed two conserved regions separated by a low-complexity variable region, possibly acting as a "smokescreen". PvRON4 contains a predicted signal sequence, a coiled-coil α-helical motif, two tandem repeats and six conserved cysteines towards the carboxy-terminus, and it is a soluble protein lacking predicted transmembrane domains or a GPI anchor. Indirect immunofluorescence assays show that PvRON4 is expressed at the apical end of schizonts and co-localizes at the rhoptry neck with PvRON2.

Relevance:

100.00%

Publisher:

Abstract:

Chip-removal machining processes consist of continuously removing small fragments of material from the workpiece being shaped, by means of cutting tools. When working with metals, these cutting forces are significant and cause tool wear as well as considerable heating of the tool and the workpiece. For this reason cutting oils are introduced, whose main functions are cooling and lubrication. Traditionally, these cutting oils have been supplied as a jet, leaving the working zone flooded. In recent years, owing to technological improvements and in an effort to economize on cutting oil, a new system known as minimum quantity lubrication has been devised, which supplies small droplets of non-water-soluble oil within a flow of pressurized air. Moreover, this system is injected close to the working zone, where it is strictly necessary. It considerably reduces cutting-oil consumption and also increases lubrication efficiency. Furthermore, it eliminates the subsequent treatments needed to recycle the cutting oil once it loses its properties. For this reason, this work sets out to study the use of a method similar to minimum quantity lubrication, supplying the same cutting oil that is used conventionally but atomized so as to convert it into an aerosol.

Relevance:

100.00%

Publisher:

Abstract:

The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has, however, been largely overlooked due to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself, and the fact that many of its members resist cultivation and are in fact new to science. With the emergence of 16S rRNA molecular tools and "post-genomics" high-resolution technologies for examining microorganisms as they occur in nature, without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review will discuss the application of molecular microbiological tools to study the human gut microbiota in a culture-independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetics or flux of metabolites through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and on how it responds to different environmental stimuli, e.g. substrates such as prebiotics, antibiotics and other drugs, and to disease. Recently these two culture-independent, high-resolution approaches have been combined into a single "transgenomic" approach, which allows changes in metabolite profiles within human biofluids to be correlated with microbiota compositional metagenomic data. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.

Relevance:

100.00%

Publisher:

Abstract:

Results from the first Sun-to-Earth coupled numerical model developed at the Center for Integrated Space Weather Modeling are presented. The model simulates physical processes occurring in space from the corona of the Sun to the Earth's ionosphere, and it represents the first step toward creating a physics-based numerical tool for predicting space weather conditions in the near-Earth environment. Two 6- to 7-day intervals, representing different heliospheric conditions in terms of the three-dimensional configuration of the heliospheric current sheet, are chosen for the simulations. These conditions lead to drastically different responses of the simulated magnetosphere-ionosphere system, emphasizing, on the one hand, the challenges one encounters in building such forecasting tools and, on the other hand, the successes that can already be achieved even at this initial stage of Sun-to-Earth modeling.

Relevance:

100.00%

Publisher:

Abstract:

More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.

Relevance:

100.00%

Publisher:

Abstract:

A growing body of evidence has been reported on how consumers may react to the introduction of genetically modified food. Studies typically contain some empirical evidence and some theoretical explanation of the data; however, to date limited effort has been devoted to systematically reviewing the existing evidence and its implications for policy. This paper contributes to the literature by bringing together the published evidence on the behavioural frameworks and on the process leading to public acceptance of genetically modified (GM) food and organisms (GMOs). In doing so, we employ a set of clearly defined search tools and a limited number of comprehensive keywords. The study attempts to gather an understanding of the published findings on the determinants of the valuation of GM food, both in terms of willingness to accept GM food and willingness to pay a premium for non-GM food, on trust in information sources regarding safety and public health, and on the attitudes ultimately underpinning such evidence. Furthermore, in the light of such evidence, we formulate some policy strategies to deal with public uncertainty regarding GMOs and, especially, GM food. (c) 2007 Elsevier Ltd. All rights reserved.