862 results for Interactive Video Instruction: A Training Tool Whose Time Has Come


Relevance:

100.00%

Publisher:

Abstract:

Research on production systems design has in recent years tended to concentrate on ‘software’ factors such as organisational aspects, work design, and the planning of production operations. In contrast, relatively little attention has been paid to maximising the contribution made by fixed assets, particularly machines and equipment. However, as the cost of unproductive machine time has increased, reliability, particularly of machine tools, has become ever more important. Reliability theory and research have traditionally been based mainly on electrical and electronic equipment, whereas mechanical devices, especially machine tools, have not received sufficiently objective treatment. A recently completed research project considered the reliability of machine tools by taking sample surveys of purchasers, maintainers and manufacturers. Breakdown data were also collected from a number of engineering companies and analysed using both manual and computer techniques. The results indicate the factors most likely to influence reliability, which in turn could lead to improved design and selection of machine tool systems. Statistical analysis of long-term field data has revealed patterns and trends of failure which could help in the design of more meaningful maintenance schemes.
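
The abstract does not name the statistical techniques used. Purely as an illustration of how such long-term breakdown data can reveal failure trends, a common approach is to fit a Weibull distribution to times between failures: the fitted shape parameter distinguishes early-life, random, and wear-out failure patterns, each suggesting a different maintenance scheme. A minimal sketch with hypothetical data:

```python
# Minimal sketch: fitting a Weibull distribution to machine tool
# time-between-failure data (hypothetical values, in hours).
import numpy as np
from scipy import stats

tbf = np.array([120.0, 340.0, 95.0, 410.0, 230.0, 515.0, 180.0, 290.0])

# Fix the location parameter at zero so only shape and scale are fitted.
shape, loc, scale = stats.weibull_min.fit(tbf, floc=0)

# shape < 1 suggests early-life failures, ~1 random failures,
# > 1 wear-out -- each points to a different maintenance scheme.
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.1f} h")
```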

Relevance:

100.00%

Publisher:

Abstract:

A great number of strategy tools are taught in strategic management modules. These tools are available to managers for use in facilitating strategic decision-making and enhancing the strategy development process in their organisations. A number of studies have examined which tools are the most popular; however, there is little empirical evidence on how their use influences the strategy process. This paper is based on a large-scale international survey of the strategy development process and examines the impact of a particular strategy tool, the Balanced Scorecard, upon that process. The Balanced Scorecard is one of the most popular strategy tools, and its use has evolved since its introduction in the 1990s. Recently, it has been suggested that, as a strategy tool, the Balanced Scorecard can influence all elements of the strategy process. The results of this study indicate that although there are significant differences in some elements of the strategy process between organisations that have implemented the Balanced Scorecard and those that have not, the impact is not comprehensive.

Relevance:

100.00%

Publisher:

Abstract:

The initial objective of this work was to evaluate and introduce fabrication techniques based on W/O/W double emulsion and O/W single emulsion systems with solvent evaporation for the incorporation of a surrogate macromolecule (BSA) into microspheres and microcapsules fabricated using P(HB-HV), PEA and their blends. Biodegradation, expressed as changes in the gross and ultrastructural morphology of BSA-loaded microparticulates with time, was monitored using SEM concomitant with BSA release. Spherical microparticulates were successfully fabricated using both the W/O/W and O/W emulsion systems. Both microspheres and microcapsules released BSA over a period of 24 to 26 days. BSA release from P(HB-HV) 20% PCL 11 microcapsules increased steadily with time, while BSA release from all other microparticulates was characterised by an initial lag phase followed by exponential release lasting 6-11 days. Microcapsules were found to biodegrade more rapidly than microspheres fabricated from the same polymer. The incubation of microparticulates in newborn calf serum, synthetic gastric juice and pancreatin solution showed that microspheres and microcapsules were susceptible to enzymatic biodegradation. The in vitro incubation of microparticulates in Hanks' buffer demonstrated limited biodegradation of microspheres and microcapsules by simple chemical hydrolysis. BSA release was thought to occur as a result of the macromolecule diffusing through either inherent micropores or via pores and channels generated in situ by previously dissolved BSA. However, in all cases, irrespective of percentage loading or fabrication polymer, low encapsulation efficiencies were obtained with the W/O/W and O/W techniques (4.2±0.9% to 15.5±0.5%, n=3), thus restricting the use of these techniques for the generation of microparticulate sustained drug delivery devices. In order to overcome this low encapsulation efficiency, a W/O single emulsion technique was developed and evaluated in an attempt to minimise the loss of the macromolecule into the continuous aqueous phase and increase encapsulation efficiency. Poly(lactide-co-glycolide) [PLCG] 75:25 and 50:50, PEA alone and PEA blended with PLCG 50:50 to accelerate biodegradation, were used to microencapsulate the water-soluble antibiotic vancomycin, a putative replacement for gentamicin in the control of bacterial infection in orthopaedic surgery, especially during total hip replacement. Spherical microspheres (17.39±6.89 μm, n=74, to 56.5±13.8 μm, n=70) were successfully fabricated with vancomycin loadings of 10, 25 and 50%, regardless of the polymer blend used. All microspheres remained structurally intact over the period of vancomycin release and exhibited high percentage yields (40.75±2.86% to 97.16±4.3%, n=3) and encapsulation efficiencies (47.75±9.0% to 96.74±13.2%, n=12). PLCG 75:25 microspheres with a vancomycin loading of 50% were judged to be the most useful since they had an encapsulation efficiency of 96.74±13.2%, n=12, and sustained therapeutically significant vancomycin release (15-25 μg/ml) for up to 26 days. This work has provided the means for the fabrication of a spectrum of prototype biodegradable microparticulates, whose biodegradation has been characterised in physiological media and which have the potential for the sustained delivery of therapeutically useful macromolecules, including water-soluble antibiotics for orthopaedic applications.
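
For readers unfamiliar with the figures quoted above, encapsulation efficiency is conventionally the drug actually incorporated as a percentage of the drug added during fabrication, and percentage yield relates recovered microsphere mass to total solids used. A minimal sketch of these conventional definitions, with hypothetical masses (not taken from the thesis):

```python
# Minimal sketch: encapsulation efficiency and percentage yield,
# as conventionally defined (hypothetical masses, in mg).
def encapsulation_efficiency(actual_drug, theoretical_drug):
    """Drug actually encapsulated as a percentage of drug added."""
    return 100.0 * actual_drug / theoretical_drug

def percentage_yield(microsphere_mass, polymer_mass, drug_mass):
    """Recovered microsphere mass as a percentage of total solids used."""
    return 100.0 * microsphere_mass / (polymer_mass + drug_mass)

print(encapsulation_efficiency(48.4, 50.0))  # ~96.8%, cf. PLCG 75:25 above
print(percentage_yield(97.2, 50.0, 50.0))    # ~97.2%
```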

Relevance:

100.00%

Publisher:

Abstract:

Chief pharmacists in 209 hospitals were surveyed about ADR reporting schemes, the priority given to ADR reporting, and attitudes towards ADR reporting. ADR reporting had a low managerial priority. Local reporting schemes were found to be operating in 37% of trusts, but there were few plans to start new schemes. Few problems were encountered with the introduction of pharmacist ADR reporting. Chief pharmacists had concerns about the competence of hospital pharmacists to detect ADRs and were in favour of increased training. Lack of time on wards and recruitment difficulties were suggested as reasons for hospital pharmacist under-reporting. Teaching hospitals appeared to have an increased interest in ADR reporting. A retrospective analysis of reporting trends within the West Midlands region from 1994 showed increasing or stable reporting rates for most sectors of reporters, except for general practitioners (GPs). The West Midlands region maintained higher ADR reporting rates than the rest of the UK. National reporting figures showed a worrying decline in ADR reports from healthcare professionals. Variation was found in the ADR reporting rates of Acute NHS Hospital Trusts and Primary Care Trusts (PCTs) in the West Midlands region, including correlations with prescribing rates and other PCT characteristics. Qualitative research into the attitudes of GPs towards the Yellow Card scheme was undertaken. A series of qualitative interviews with GPs identified barriers to, and positive motivators for, their involvement in the Yellow Card scheme. A grounded theory of GP involvement in the Yellow Card scheme was developed to explain GP behaviour, which could be used to inform potential solutions to halt declining rates of reporting. Under-reporting of ADRs continues to be a major concern to those who administer spontaneous reporting schemes.

Relevance:

100.00%

Publisher:

Abstract:

This study explores the fascination which English culture held for turn-of-the-century Vienna. The writers discussed are the renowned Anglophile Hugo von Hofmannsthal, the more ambivalent Hermann Bahr and the idealizing, but Janus-faced, Peter Altenberg. For the more widely known poet, prose writer and playwright Hofmannsthal, individual aspects of his engagement with English culture have already been well researched. The same cannot be said of Hermann Bahr, whose extensive literary oeuvre has now largely been forgotten and who has instead come to be valued as a prominent figure in the cultural life of modernist Vienna, or of Peter Altenberg, whose literary fame rests mainly on his prose poems and who, a legend in his lifetime, has in recent years increasingly attracted research interest as a phenomenon and ‘embodiment’ of the culture of his time. While their engagement with French literature, for example, has long received its due share of attention, their debt to English culture has until now been neglected. This thesis therefore sets out to explore Hofmannsthal's, Bahr's and Altenberg's perception and portrayal of English civilization, ranging from English character and stereotypes to what they saw as the principles of British society. It goes on to investigate the impulses they derived from Pre-Raphaelite art (Rossetti, Burne-Jones, Whistler) and the Arts and Crafts movement centred around William Morris, as well as their inspiration by the art criticism of John Ruskin and Walter Horatio Pater. In English literature, one of the focal points is their reading and evaluation of aestheticism as reflected in the life and writings of the Dubliner Oscar Wilde, who was perceived by these Austrian authors as a predominant figure in London's cultural life. Similarly, they regarded his compatriot George Bernard Shaw as a key player in turn-of-the-century English (and European) culture. Hermann Bahr largely identified with him. Hofmannsthal, on the other hand, while having some reservations, acknowledged his importance and achievements, whereas Peter Altenberg saw in Shaw a model to reassure him as his writings were becoming more openly didactic and even more miniaturistic than they had already been. He turned to Shaw, too, to explain and justify his new goal of making his texts intelligible to a wider circle of readers.

Relevance:

100.00%

Publisher:

Abstract:

Humidity sensors constructed from polymer optical fiber Bragg gratings (POFBGs) respond to changes in the water content of the fiber induced by varying environmental conditions. Because this water content change is a diffusion process, the response time of a POFBG sensor strongly depends on the geometry and size of the fiber. In this work we investigate the use of laser micromachining of D-shaped and slotted structures to improve the response time of polymer fiber grating based humidity sensors. A significant improvement in response time was achieved in laser-micromachined D-shaped POFBG humidity sensors. The slotted geometry allows water rapid access to the core region, but this does not of itself improve the response time, owing to the slow expansion of the bulk of the cladding. We show that by straining the slotted sensor, the expansion component can be removed, so that the response time is determined only by the more rapid, water-induced change in the core refractive index. In this way the response time is reduced by a factor of 2.5.
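
As a rough illustration of why the machined geometries help (an estimate assuming one-dimensional Fickian diffusion with a constant diffusion coefficient D over a path of length L; the values and scaling are not taken from the paper itself):

```latex
% Characteristic diffusion (response) time over a path L:
\tau \approx \frac{L^{2}}{D}
% Halving the distance water must travel to reach the core region
% therefore cuts the response time by roughly a factor of four,
% which is why D-shaping or slotting the fiber speeds up the sensor.
```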

Relevance:

100.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from the size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and even ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular toolbox, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

100.00%

Publisher:

Abstract:

This chapter contributes to the anthology on learning to research - researching to learn because it emphasises a need to design curricula that enable living research and ongoing researcher development, rather than curricula that restrict student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. In what is referred to as the neo-liberal university, a strong priority has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE as securing future material affluence, rather than study as an ongoing investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model, where through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the number of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week by week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career. The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for swift progression towards a highly paid job, rather than crafted for the reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence, helping them make connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum if students do not easily make links across time over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.

Relevance:

100.00%

Publisher:

Abstract:

The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is only now starting. Emergency response is an area where the reliability and timeliness of information and technologies are of the essence. It is therefore quite natural that more widespread adoption in this area has not been seen until now, when Semantic Web technologies are mature enough to support the high requirements of the application area. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet and exchange ideas and results. Our intention is for this workshop, and hopefully coming workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC, formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field, making it a suitable location at which to discuss the application of Semantic Web technology to our specific area of applications. Hence, we chose to arrange our first SMILE workshop at ESWC 2013. However, this workshop does not focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with technologies and principles for what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions like: "How can we make sense of, and furthermore make use of, all the data that is produced by different kinds of social media platforms in an emergency situation?" For the first edition of this workshop the chairs collected the following main topics of interest:

• Semantic Annotation for understanding the content and context of social media streams.
• Integration of Social Media with Linked Data.
• Interactive Interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social Data Mining.
• Collaborative tools and services for Citizens, Organisations, Communities.
• Privacy, ethics, trustworthiness and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of Social Media and Linked Data methodologies in real-life scenarios.

All of these, applied in the context of:

• Crisis and Disaster Management
• Emergency Response
• Security and Citizen Journalism

The workshop received six high-quality paper submissions, and after a thorough review process, thanks to our program committee, four of these papers were accepted for the workshop (67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four papers discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualizing and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. Finally, the fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster.
Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e. so-called Linked Science, as well as recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion, which was held at the end of the workshop. A separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, as well as the workshop chair of ESWC 2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!

Relevance:

100.00%

Publisher:

Abstract:

A humidity sensor made from a polymer optical fiber Bragg grating (POFBG) responds to changes in the water content of the fiber induced by changing environmental conditions. The response time strongly depends on fiber size, as the water content change is a diffusion process. Ultrashort laser pulses provide an effective microfabrication method for achieving spatially localized modification of materials. In this work we used an excimer laser to create different microstructures (slot, D-shape) in POFBGs to improve their performance. A significant improvement in response time was achieved in a laser-etched D-shaped POFBG humidity sensor.

Relevance:

100.00%

Publisher:

Abstract:

A review of the literature reveals that little research has attempted to demonstrate whether a relationship exists between the type of teacher training a science teacher has received and the perceived attitudes of his/her students. The teacher preparation factors examined in this study include the college major chosen by the science teacher, the highest degree earned, the number of years of teaching experience, the type of science course taught, and the grade level taught. This study examined how these factors could influence the behaviors which are characteristic of the teacher, and how these behaviors could be reflected in the classroom environment experienced by the students. The instrument used in the study was the Classroom Environment Scale (CES), Real Form. The measured classroom environment was broken down into three separate dimensions, with three components within each dimension in the CES. Multiple regression analyses examined how components of the teachers' education influenced the students' perceptions of the dimensions of the classroom environment. The study took place in Miami-Dade County, Florida, with a predominantly urban high school student population. There were 40 secondary science teachers involved, each with an average of 30 students, giving a total sample of 1200 students. The teachers who participated in the study taught the entire range of secondary science courses offered at this large school district. All teachers were selected by the researcher so that the sample was balanced between teachers who were education majors and those who were science majors; the researcher also selected teachers so that a balance occurred with regard to the different levels of college degrees earned among those involved in the study. Several research questions sought to determine whether there were significant differences between the type of educational background obtained by secondary science teachers and the students' perception of the classroom environment. Other research questions sought to determine whether there were significant differences in the students' perceptions of the classroom environment between secondary science teachers who taught biological content and those who taught non-biological content sciences. An additional research question sought to evaluate whether the grade level taught would affect the students' perception of the classroom environment. (Abstract shortened by UMI.)
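
The abstract does not give the exact model specification, but the analysis it describes is a standard multiple regression of each perceived CES dimension on teacher-preparation factors. A minimal sketch, assuming hypothetically coded predictors (science major as 0/1, degree level, years of experience; all values invented for illustration):

```python
# Minimal sketch: multiple regression of a CES dimension score on
# teacher-preparation factors (all data hypothetical).
import numpy as np

# Columns: science major (0/1), highest degree (1=BSc..3=PhD), years teaching.
X = np.array([[1, 2, 10],
              [0, 1, 3],
              [1, 3, 20],
              [0, 2, 7],
              [1, 1, 5]], dtype=float)
y = np.array([7.2, 5.1, 8.0, 6.3, 6.8])  # mean class score on one CES dimension

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["intercept", "science_major", "degree", "years"],
               coef.round(3))))
```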

Relevance:

100.00%

Publisher:

Abstract:

Next generation networks are characterized by ever increasing complexity, intelligence, heterogeneous technologies and increasing user expectations. Telecommunication networks in particular have become truly global, consisting of a variety of national and regional networks, both wired and wireless. Consequently, the management of telecommunication networks is becoming increasingly complex. In addition, network security and reliability requirements impose additional overheads which increase the size of the data records; this in turn causes acute network traffic congestion. There is no single network management methodology that can control the various requirements of today's networks while providing a good level of Quality of Service (QoS) and network security. Therefore, an integrated approach is needed in which a combination of methodologies can provide solutions and answers to network events (which cause severe congestion and compromise quality of service and security). The proposed solution takes a systematic approach to designing a network management system based upon recent advances in mobile agent technologies. It provides a new traffic management system for telecommunication networks that is capable of (1) reducing the network traffic load (and thus traffic congestion), (2) overcoming existing network latency, (3) adapting dynamically to the traffic load of the system, (4) operating in heterogeneous environments with improved security, and (5) exhibiting robust, fault-tolerant behavior. This solution addresses several key challenges in the development of network management for telecommunication networks using mobile agents. We have designed several types of agents whose interactions allow complex management actions to be performed and integrated. Our solution is decentralized to eliminate excessive bandwidth usage, extends the capabilities of the Simple Network Management Protocol (SNMP), and remains fully compatible with existing standards.
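
As an illustration only (the thesis architecture is not specified beyond the capabilities listed above), the traffic-saving idea behind mobile agents is to move the computation to the data: instead of polling every node and hauling raw management records to a central manager, an agent visits each node, aggregates locally, and carries back only a summary. A minimal, library-free sketch of that idea:

```python
# Minimal sketch of the mobile-agent idea: aggregate management data
# at each node and carry back only a summary (all data hypothetical).
class MonitoringAgent:
    def __init__(self):
        self.summary = {}

    def visit(self, node_name, raw_records):
        # Local aggregation: only the peak utilisation leaves the node,
        # not the full record set -- this is the bandwidth saving.
        self.summary[node_name] = max(r["util"] for r in raw_records)

network = {
    "router-a": [{"util": 0.41}, {"util": 0.87}, {"util": 0.63}],
    "router-b": [{"util": 0.12}, {"util": 0.29}],
}

agent = MonitoringAgent()
for node, records in network.items():   # the agent's itinerary
    agent.visit(node, records)
print(agent.summary)  # e.g. {'router-a': 0.87, 'router-b': 0.29}
```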

Relevance:

100.00%

Publisher:

Abstract:

The safety of workers in nighttime roadway work zones has become a major concern for state transportation agencies due to the increase in the number of work zone fatalities. During the last decade, several studies have focused on improving safety in nighttime roadway work zones, but the element that is still missing is a set of tools for translating the research results into practice. This paper discusses: 1) the importance of translating research results related to worker safety and the safety planning of nighttime work zones into practice, and 2) examples of tools that can be used for translating the results of such studies into practice. A tool that can propose safety recommendations for nighttime work zones and a web-based safety training tool for workers are presented in this paper. The tools were created as a component of a five-year research study on the assessment of the safety of nighttime roadway construction. The objectives of both tools are explained, as well as their functionalities (i.e., what the tools can do for the users), their components (e.g., knowledge base, database, and interfaces), and their structures (i.e., how the components of the tools are organized to meet the objectives). Evaluations by the proposed users of each tool are also presented.
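
The paper does not detail the knowledge base of the recommendation tool; as a purely hypothetical illustration of the knowledge-base-plus-interface pattern it describes, a rule-based recommender maps work zone conditions to safety advice. All rule thresholds and field names below are invented for the sketch:

```python
# Minimal sketch of a rule-based safety recommendation component
# (illustrative only; the actual tool's knowledge base is not described).
def recommend(conditions):
    rules = [
        (lambda c: c["speed_limit_mph"] >= 55,
         "Consider positive protection devices between traffic and workers."),
        (lambda c: c["lighting_lux"] < 54,
         "Add portable light towers to raise illumination of work areas."),
        (lambda c: c["flagger_present"],
         "Equip flaggers with retroreflective apparel and lighted wands."),
    ]
    return [advice for test, advice in rules if test(conditions)]

print(recommend({"speed_limit_mph": 60, "lighting_lux": 30,
                 "flagger_present": True}))
```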

Relevance:

100.00%

Publisher:

Abstract:

The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural network based data mapping and forecasting processes, and genetic algorithm based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART has been tested on the experimental data of over 800 cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to the micro-end-milling operation study of the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
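
MOGART's internals are not described beyond the component list, but the genetic-algorithm step it names can be illustrated: search for model parameters that make an analytical cutting-force model reproduce a measured force profile. A minimal sketch, assuming a toy one-amplitude-plus-offset force model and synthetic "measured" data (nothing here is taken from MOGART itself):

```python
# Minimal sketch: genetic algorithm fitting parameters of a toy
# cutting-force model to a "measured" profile (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)

def force_model(amplitude, wear_offset):
    # Toy analytical model: periodic tooth engagement plus a wear offset.
    return amplitude * np.abs(np.sin(2 * np.pi * 4 * t)) + wear_offset

measured = force_model(5.0, 1.2) + rng.normal(0, 0.1, t.size)

def fitness(pop):
    # Negative mean-squared error against the measured profile.
    return -np.array([np.mean((force_model(a, w) - measured) ** 2)
                      for a, w in pop])

pop = rng.uniform([0, 0], [10, 5], size=(40, 2))  # (amplitude, wear_offset)
for _ in range(60):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-20:]]                        # selection
    children = parents[rng.integers(0, 20, (20, 2)), [0, 1]]  # crossover
    children += rng.normal(0, 0.1, children.shape)            # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax(fitness(pop))]
print(f"estimated amplitude={best[0]:.2f}, wear offset={best[1]:.2f}")
```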

Relevance:

100.00%

Publisher:

Abstract:

The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural network based data mapping and forecasting processes, and genetic algorithm based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART has been tested on the experimental data of over 800 cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to the micro-end-milling operation study of the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.