Abstract:
This paper examines how implant and electrode technology can now be employed to create biological brains for robots, to enable human enhancement and to diminish the effects of certain neural illnesses. In all cases the end result is to increase the range of abilities of the recipients. An indication is given of a number of areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking the human brain directly with a computer. An overview of some of the latest developments in the field of Brain to Computer Interfacing is also given in order to assess their advantages and disadvantages. The emphasis is placed firmly on practical studies that have been and are being undertaken and reported on, as opposed to those speculated, simulated or proposed as future projects. Related areas are discussed briefly, only in the context of their contribution to the studies being undertaken. The area of focus is the use of invasive implant technology, where a connection is made directly with the cerebral cortex and/or nervous system. Tests and experimentation that do not involve human subjects are invariably carried out a priori to indicate the eventual possibilities before human subjects themselves are involved. Some of the more pertinent animal studies from this area are discussed, including our own involving neural growth. The paper goes on to describe human experimentation, in which neural implants have linked the human nervous system bi-directionally with technology and the internet. A view is taken as to the future prospects for implantable computing in terms of both therapy and enhancement.
Abstract:
Objective: To describe the training undertaken by pharmacists employed in a pharmacist-led, information technology-based intervention study to reduce medication errors in primary care (PINCER Trial), to evaluate the pharmacists' assessment of that training, and to assess the time implications of undertaking it. Methods: Six pharmacists received training, including training on root cause analysis and educational outreach, to enable them to deliver the PINCER Trial intervention. The training was evaluated using self-report questionnaires at the end of each session, and the time taken to complete each session was recorded. Data from the evaluation forms were entered onto a Microsoft Excel spreadsheet, independently checked, and the summary of results further verified. Frequencies were calculated for responses to the three-point Likert scale questions. Free-text comments from the evaluation forms and pharmacists' diaries were analysed thematically. Key findings: All six pharmacists received 22 hours of training over five sessions. In four of the five sessions, the pharmacists who completed an evaluation form (27 of 30 forms were completed) stated they were satisfied or very satisfied with the various elements of the training package. Analysis of the free-text comments and the pharmacists' diaries showed that the principles of root cause analysis and educational outreach were viewed as useful tools to help pharmacists conduct pharmaceutical interventions, both in the study and in other pharmacy roles that they undertook. The opportunity to undertake role play was a valuable part of the training received. Conclusions: The findings presented in this paper suggest that providing the PINCER pharmacists with training in root cause analysis and educational outreach contributed to the successful delivery of PINCER interventions, and could potentially be utilised by other pharmacists based in general practice to deliver pharmaceutical interventions to improve patient safety.
Abstract:
Objective: To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex, IT-enabled, pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods: PINCER pharmacists manually recorded patients' demographics, details of the interventions recommended, actions undertaken by practice staff, and the time taken to manage individual cases of hazardous medicines management. Data were coded and double entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data. Key findings: Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs) but not completed at the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions: The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.
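As a note on the interval arithmetic, the proportions above are consistent with a simple normal-approximation (Wald) confidence interval; a minimal Python sketch, assuming that method (the abstract does not name the interval used):

```python
import math

def wald_ci(successes: int, total: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / total
    se = math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Clinically relevant cases: 1463 of 2026 (reported above as 72%, 95% CI 70, 74)
p, lo, hi = wald_ci(1463, 2026)
print(f"{100*p:.0f}% (95% CI {100*lo:.0f}, {100*hi:.0f})")
```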
Abstract:
The use of information and communication technologies (ICT) for transforming the way public services are delivered has been an area of investment and focus in many countries in recent years. The UK government envisioned moving from e-Government to transformational government by 2008, and initiatives such as the National Programme for IT (NPfIT) were underway towards this end. NPfIT was the largest civil IT programme worldwide, at an initial estimated cost of £12.4bn over a ten-year period. It was launched in 2002 by the UK government as part of its policy to transform the English NHS and to implement standardised IT solutions at a national level. However, this top-down, government-led approach came under increasing scrutiny, and is now being reconfigured towards a more decentralised mode of operations. This paper looks into the implementation of NPfIT and analyses the reasons behind its failure, and what effect the new NHS reforms are likely to have on the health sector. We draw from past studies (Weill and Ross, 2005) to highlight the key areas of concern in IT governance, using the NPfIT as an illustration.
Abstract:
European air quality legislation has reduced emissions of air pollutants across Europe since the 1970s, affecting air quality, human health and regional climate. We used a coupled composition-climate model to simulate the impacts of European air quality legislation and technology measures implemented between 1970 and 2010. We contrast simulations using two emission scenarios: one with actual emissions in 2010 and the other with the emissions that would have occurred in 2010 in the absence of technological improvements and end-of-pipe treatment measures in the energy, industrial and road transport sectors. European emissions of sulphur dioxide, black carbon (BC) and organic carbon in 2010 are 53%, 59% and 32% lower, respectively, than the emissions that would have occurred in 2010 in the absence of legislative and technology measures. These emission reductions decreased simulated European annual mean concentrations of fine particulate matter (PM2.5) by 35%, sulphate by 44%, BC by 56% and particulate organic matter by 23%. The reduction in PM2.5 concentrations is calculated to have prevented 80 000 (95% confidence interval: 37 000–116 000) premature deaths annually across the European Union, resulting in a perceived financial benefit to society of US$232 billion annually (1.4% of 2010 EU GDP). The reduction in aerosol concentrations due to legislative and technology measures caused a positive change in the aerosol radiative effect at the top of the atmosphere, reduced atmospheric absorption and increased the amount of solar radiation incident at the surface over Europe. We used an energy budget approximation to estimate that these changes in the radiative balance have increased European annual mean surface temperatures by 0.45 ± 0.11 °C and precipitation by 13 ± 0.8 mm yr−1. Our results show that the implementation of European legislation and technological improvements to reduce the emission of air pollutants has improved air quality and human health over Europe, while also having an unintended impact on the regional radiative balance and climate.
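The abstract does not spell out the energy budget approximation; a common zero-dimensional form relates a radiative perturbation to a temperature response through a sensitivity parameter. A minimal sketch under that assumption, with illustrative values that are not the study's numbers:

```python
# Zero-dimensional energy-budget form: delta_T ~= lambda_ * delta_F,
# with lambda_ a climate sensitivity parameter (K per W m^-2).
# Both values below are illustrative assumptions, not the study's results.
lambda_ = 0.5    # K / (W m^-2), assumed regional sensitivity
delta_F = 0.9    # W m^-2, assumed change in net radiative effect
delta_T = lambda_ * delta_F
print(f"Implied surface temperature response: {delta_T:.2f} K")
```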
Abstract:
The utilization of protein hydrolysates in food systems is frequently hindered by their bitterness and hygroscopicity. Spray drying technology could be an alternative for reducing these problems. The aim of this work was to reduce or mask the bitter taste of casein hydrolysate using spray drying with mixtures of gelatin and soy protein isolate (SPI) as carriers. Six formulations were studied: three with 20% hydrolysate and 80% carrier mixture (gelatin/SPI at proportions of 50/50, 40/60 and 60/40%) and three with 30% hydrolysate and 70% carrier mixture (gelatin/SPI at proportions of 50/50, 40/60 and 60/40%). The spray-dried formulations were evaluated by SEM, hygroscopicity, thermal behavior (DSC), dissolution, and bitter taste, the last assessed by a trained sensory panel using a paired-comparison test (free samples vs. spray-dried samples); all samples were presented in powder form. SEM analysis showed mostly spherically shaped particles with many concavities, and some particles with pores. All formulations were compatible with both oil and water and showed lower hygroscopicity values than the free casein hydrolysate. At Aw 0.83, the free hydrolysate showed a Tg about 25 °C lower than the formulations, indicating that the formulations may be more stable at Aw ≥ 0.65, since the glass transition should be prevented. The sensory panel found the formulations, tasted in powder form, to be less bitter (P < 0.05) than the free casein hydrolysate. These results indicate that spray drying casein hydrolysate with mixtures of gelatin and SPI successfully attenuated its bitterness. Thus, spray drying widens the possibilities for application of casein hydrolysates.
Abstract:
This research is based on consumer complaints with respect to recently purchased consumer electronics. It investigates device management as a tool to aid consumers and to manage their mobile products, so that issues can be resolved before, or as soon as, the consumer is aware one exists. The problem at present is that mobile devices are becoming very advanced pieces of technology, and not all manufacturers and network providers have kept up with the support needs of end users. The subject of the research is therefore to investigate how device management could be used to promote the research and development of mobile devices and to provide a better experience for the consumer. The wireless world is becoming increasingly complex as revenue opportunities are driven by new and innovative data services, and we can no longer expect the customer to have the knowledge or ability to configure their own device. Device management (DM) platforms can address the challenges of device configuration and support through new enabling technologies. Leveraging these technologies allows a network operator to reduce the cost of subscriber ownership, drive increased ARPU (average revenue per user) by removing barriers to adoption, reduce churn by improving the customer experience, and increase customer loyalty. DM technologies provide a flexible and powerful management method, but they manage the same device features that have historically been configured manually through call centres or by the end user making changes directly on the device. For this reason DM technologies must be treated as part of a wider support solution. The traditional requirements for discovery, fault finding, troubleshooting and diagnosis are as relevant with DM as they are in the current human support environment, yet the current generation of solutions does little to address this problem. In deploying an effective device management solution, the network operator must consider the integration of the DM platform, interfacing with many areas of the business, supported by knowledge of the relationships between devices, applications, solutions and services maintained on an ongoing basis. Complementing the DM solution with published device information, setup guides, training material and web-based tools will ensure the quality of the customer experience, ensuring that problems are completely resolved and driving data usage by focusing customer education on the use of the wireless service. In this way device management becomes a tool used both internally, within the network operator or device vendor, and by customers themselves, with each user empowered to manage the device effectively without any prior knowledge or experience, confident that the changes they apply will be relevant, accurate, stable and compatible. The value offered by an effective DM solution with an expert knowledge service will become a significant differentiator for the network operator in an ever more competitive wireless market. This research document is intended to highlight some of the issues the industry faces as device management technologies become more prevalent, and offers some potential solutions to simplify the increasingly complex task of managing devices on the network, where device management can be used as a tool to aid customer relations and to manage customers' mobile products in order to resolve issues before the user is aware one exists.
The research is broken down into the following areas: customer relationship management (CRM), device management, the role of knowledge within DM, companies that have successfully implemented device management, and the future of device management and CRM. It also includes questionnaires aimed at technical support agents and mobile device users, and interviews carried out with CRM managers within support centres to corroborate the evidence gathered. To conclude, the document considers the advantages and disadvantages of device management, attempts to determine the influence it will have over customer support centres, and discusses what methods could be used to implement it.
Abstract:
This research aimed to identify the main factors that lead technology startups to fail. The study focused on companies located in the Southeast region of Brazil that operated between 2009 and 2014. First, a literature review was conducted to build an understanding of the basic concepts of entrepreneurship, as well as modern techniques for developing entrepreneurship; an analysis of the entrepreneurial scenario in Brazil, with a focus on the Southeast, was also carried out. A qualitative study followed, in which 24 specialists from startups were interviewed and asked which factors were crucial in leading a technology startup to fail. After analysis of the results, four main factors were identified, and these factors were validated through a quantitative survey. A questionnaire was formulated based on the interview responses and distributed to founders and executives of startups, both failed and successful. The questionnaire was answered by 56 companies, and the answers were analysed with the factor analysis statistical method to check the validity of the questionnaire. Finally, logistic regression was used to determine the extent to which the factors led to the startups' failure. The results obtained suggest that the most significant factor leading technology startups in southeastern Brazil to fail is problems with interpersonal relationships between partners or investors.
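To illustrate the final analysis step, here is a minimal sketch of a logistic regression relating extracted factor scores to failure. The data and factor structure are hypothetical; the study's actual variables are not reproduced here:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(56, 4))   # 56 firms, 4 extracted factor scores (hypothetical)
# Hypothetical outcome: failure driven mainly by the first factor, plus noise.
y = (X[:, 0] + 0.3 * rng.normal(size=56) > 0).astype(int)  # 1 = failed

model = LogisticRegression().fit(X, y)
print("coefficients per factor:", model.coef_.round(2))
print("odds ratios:", np.exp(model.coef_).round(2))
```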
Abstract:
The main objective of this thesis is to analyze customer intimacy strategy in B2B technology businesses in Colombia and the variables that have a direct relationship with it, namely perception, trust and networking, and to examine how a customer intimacy strategy, properly managed or mismanaged, can lead a company to positive or negative results in an operation, in terms of business opportunities, relations, and profitable and sustainable sales. With a population of almost 50 million people, average GDP growth of 4.22% (2013 to 2017), a strategic geographic location in Latin America close to the middle of the region with direct access to the Pacific and Atlantic oceans, and on the verge of reaching a peace agreement ending its long-standing social and security conflict with the local guerrillas, Colombia is a country with a stable economic present and a promising future. But despite the appealing business landscape and opportunities, both in number and size, it is a developing economy, and firms willing to run a startup or that currently have B2B technology operations in this country will find that uncertainty and mistrust are two of the most critical variables to overcome in order to achieve success. Their relevance varies from one region to another, but they are considered of great importance throughout the country. This matter is highly important to B2B technology businesses in Colombia because few firms are aware of the importance of customer intimacy strategy, believing that it is just a matter of social relationships and not considering the diverse variables, such as perception, trust and networking, that compose it. Customer intimacy strategy ultimately becomes the main and most relevant source of sales in a B2B technology environment in Colombia.
Abstract:
Starting from the perspective of heterodox Keynesian-Minskyian-Kindlebergian financial economics, this paper begins by highlighting a number of mechanisms that contributed to the current financial crisis. These include excess liquidity, income polarisation, conflicts between financial and productive capital, lack of intelligent regulation, asymmetric information, principal-agent dilemmas and bounded rationalities. However, the paper then proceeds to argue that perhaps more than ever the ‘macroeconomics’ that led to this crisis only makes analytical sense if examined within the framework of the political settlements and distributional outcomes in which it had operated. Taking the perspective of critical social theories the paper concludes that, ultimately, the current financial crisis is the outcome of something much more systemic, namely an attempt to use neo-liberalism (or, in US terms, neo-conservatism) as a new technology of power to help transform capitalism into a rentiers’ delight. And in particular, into a system without much ‘compulsion’ on big business; i.e., one that imposes only minimal pressures on big agents to engage in competitive struggles in the real economy (while inflicting exactly the opposite fate on workers and small firms). A key component in the effectiveness of this new technology of power was its ability to transform the state into a major facilitator of the ever-increasing rent-seeking practices of oligopolistic capital. The architects of this experiment include some capitalist groups (in particular rentiers from the financial sector as well as capitalists from the ‘mature’ and most polluting industries of the preceding techno-economic paradigm), some political groups, as well as intellectual networks with their allies – including most economists and the ‘new’ left. Although rentiers did succeed in their attempt to get rid of practically all fetters on their greed, in the end the crisis materialised when ‘markets’ took their inevitable revenge on the rentiers by calling their (blatant) bluff.
Abstract:
The design of a Gilbert cell mixer and a low noise amplifier (LNA) using GaAs PHEMT technology is presented. Compatibility is shown for co-integration of both blocks on the same chip, to form a high-performance 1.9 GHz receiver front end. The designed LNA shows 9.23 dB gain and a 2.01 dB noise figure (NF). The mixer is designed to operate at RF = 1.9 GHz, LO = 2.0 GHz and IF = 100 MHz, with a gain of 14.3 dB and a single sideband noise figure (SSB NF) of 9.6 dB. The mixer presents a bandwidth of 8 GHz.
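Although the abstract does not report a cascade analysis, the front-end noise figure implied by the two stages can be estimated with Friis' formula; a short sketch under the assumption of a simple two-stage (LNA then mixer) cascade:

```python
import math

def db_to_lin(db: float) -> float:
    """Convert a decibel quantity to its linear power ratio."""
    return 10 ** (db / 10)

# Friis' formula for two stages: F_total = F1 + (F2 - 1) / G1 (linear terms).
g_lna = db_to_lin(9.23)   # LNA gain, from the abstract
f_lna = db_to_lin(2.01)   # LNA noise figure, from the abstract
f_mix = db_to_lin(9.6)    # mixer SSB noise figure, from the abstract

f_total = f_lna + (f_mix - 1) / g_lna
print(f"Estimated front-end NF ~= {10 * math.log10(f_total):.2f} dB")
```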
Abstract:
The present work shows an experimental and theoretical study of heat flow when end milling, at high speed, hardened steels applied to moulds and dies. AISI H13 and AISI D2 steels were machined with two types of ball nose end mills: coated with (TiAl)N and tipped with PcBN. The workpiece geometry was designed to simulate tool-workpiece interaction in real situations found in the mould industry, in which complex surfaces and thin walls are commonly machined. Compressed air and cold air cooling systems were compared to dry machining. Results indicated a relatively small temperature variation, with a higher range when machining AISI D2 with the PcBN-tipped end mill. All cooling systems used demonstrated good capacity to remove heat from the machined surface, especially the cold air. Compressed air was the most suitable for keeping the workpiece at a relatively stable temperature. A theoretical model was also proposed to estimate the energy transferred to the workpiece (Q) and the average convection coefficient (h̄) for the cooling systems used. The model used an FEM simulation and a steepest descent method to find the best values for both variables.
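A minimal sketch of the inverse procedure described, assuming a steepest-descent fit of Q and h̄ to measured probe temperatures; the simulate() stand-in for the FEM model and all measured values below are hypothetical:

```python
import numpy as np

# Hypothetical probe readings (K); the paper's measured data are not given.
t_measured = np.array([303.4, 301.2, 295.1])

def simulate(q: float, h: float) -> np.ndarray:
    """Stand-in for the FEM thermal model: predicted probe temperatures
    as a linear function of energy input Q and convection coefficient h_bar."""
    a = np.array([0.10, 0.08, 0.05])   # assumed sensitivity to Q
    b = np.array([0.02, 0.01, 0.03])   # assumed sensitivity to h_bar
    return 290.0 + q * a - h * b

def loss(q: float, h: float) -> float:
    """Sum of squared errors between simulated and measured temperatures."""
    r = simulate(q, h) - t_measured
    return float(r @ r)

# Steepest descent on (Q, h_bar) using a central-difference gradient.
q, h, step, eps = 100.0, 50.0, 20.0, 1e-3
for _ in range(5000):
    gq = (loss(q + eps, h) - loss(q - eps, h)) / (2 * eps)
    gh = (loss(q, h + eps) - loss(q, h - eps)) / (2 * eps)
    q, h = q - step * gq, h - step * gh

print(f"Q ~= {q:.1f}, h_bar ~= {h:.1f}, residual = {loss(q, h):.2e}")
```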
Abstract:
Objective - To evaluate the effect of changing the mode of ventilation from spontaneous to controlled on the arterial-to-end-tidal CO2 difference [P(a-ET)CO2] and physiological dead space (VD(phys)/VT) in laterally and dorsally recumbent halothane-anesthetized horses. Study Design - Prospective, experimental, nonrandomized trial. Animals - Seven mixed-breed adult horses (1 male and 6 female) weighing 320 ± 11 kg. Methods - Horses were anesthetized in 2 positions - right lateral and dorsal recumbency - with a minimum interval of 1 month. Anesthesia was maintained with halothane in oxygen for 180 minutes. Spontaneous ventilation (SV) was used for 90 minutes, followed by 90 minutes of controlled ventilation (CV). The same ventilator settings were used for both laterally and dorsally recumbent horses. Arterial blood gas analysis was performed every 30 minutes during anesthesia, and end-tidal CO2 (PETCO2) was measured continuously. P(a-ET)CO2 and VD(phys)/VT were calculated. Statistical analysis included analysis of variance for repeated measures over time, followed by the Student-Newman-Keuls test. Comparison between groups was performed using a paired t test; P < .05 was considered significant. Results - P(a-ET)CO2 and VD(phys)/VT increased during SV, whereas CV reduced these variables. The variables did not change significantly throughout mechanical ventilation in either group. Dorsally recumbent horses showed greater P(a-ET)CO2 and VD(phys)/VT values throughout. PaCO2 was greater during CV in dorsally positioned horses. Conclusions and Clinical Relevance - Changing the mode of ventilation from spontaneous to controlled was effective in reducing P(a-ET)CO2 and physiological dead space in both laterally and dorsally recumbent halothane-anesthetized horses. Dorsal recumbency resulted in greater impairment of effective ventilation. Capnometry has limited value for accurate estimation of PaCO2 in anesthetized horses, although it may be used to evaluate pulmonary function when paired with arterial blood gas analysis.
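For readers reproducing the derived variables: P(a-ET)CO2 is the arterial-to-end-tidal difference, and physiological dead space is commonly calculated with the Enghoff modification of the Bohr equation, VD/VT = (PaCO2 - PECO2) / PaCO2. A worked example with hypothetical values (the paper's raw data are not given here):

```python
# All values below are hypothetical, for illustration only.
pa_co2 = 55.0    # PaCO2: arterial CO2 tension (mm Hg)
pet_co2 = 45.0   # PETCO2: end-tidal CO2 (mm Hg)
pe_co2 = 30.0    # mixed-expired CO2 (mm Hg), needed for the Bohr-Enghoff form

p_a_et = pa_co2 - pet_co2              # P(a-ET)CO2
vd_vt = (pa_co2 - pe_co2) / pa_co2     # VD(phys)/VT (Enghoff modification)
print(f"P(a-ET)CO2 = {p_a_et:.1f} mm Hg, VD(phys)/VT = {vd_vt:.2f}")
```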
Abstract:
This document was adapted from a paper originally presented at the 8th Annual Caribbean Conference of Comprehensive Disaster Management, held in Montego Bay, Jamaica, in December 2013. It summarizes several activities that ECLAC has undertaken to assess the current state of information and communications technology (ICT) in the field of disaster risk management (DRM) as practiced in the Caribbean. These activities included an in-depth study that encompassed a survey of disaster management organizations in the region, an Expert Group Meeting attended by the heads of several national disaster offices, and a training workshop for professionals working in DRM in the Caribbean. One of the notable conclusions of ECLAC's investigation on this topic is that the lack of human capacity is the single largest constraint faced in the implementation of ICT projects for DRM in the Caribbean. In considering strategies to address the challenge of limited human capacity at a regional level, two separate issues are recognized: the need to increase the ICT capabilities of disaster management professionals, and the need to make ICT specialists available to disaster management organizations to advise and assist in the implementation of technology-focused projects. To that end, two models are proposed to engage with this issue at a regional level. The first entails the establishment of a network of ICT trainers in the Caribbean to help DRM staff develop a strategic understanding of how technology can be used to further their organizational goals. The second is the development of "Centres of Excellence" for ICT in the Caribbean, which would enable the deployment of specialized ICT expertise to national disaster management offices on a project-by-project basis.