954 results for "cooling chip for handheld electronic devices"


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Suction-based wound healing devices with open-pore foam interfaces are widely used to treat complex tissue defects. The impact of changes in physicochemical parameters of the wound interfaces has not been investigated. METHODS: Full-thickness wounds in diabetic mice were treated with occlusive dressing or a suction device with a polyurethane foam interface varying in mean pore size diameter. Wound surface deformation on day 2 was measured on fixed tissues. Histologic cross-sections were analyzed for granulation tissue thickness (hematoxylin and eosin), myofibroblast density (α-smooth muscle actin), blood vessel density (platelet endothelial cell adhesion molecule-1), and cell proliferation (Ki67) on day 7. RESULTS: Polyurethane foam-induced wound surface deformation increased with polyurethane foam pore diameter: 15 percent (small pore size), 60 percent (medium pore size), and 150 percent (large pore size). The extent of wound strain correlated with granulation tissue thickness that increased 1.7-fold in small pore size foam-treated wounds, 2.5-fold in medium pore size foam-treated wounds, and 4.9-fold in large pore size foam-treated wounds (p < 0.05) compared with wounds treated with an occlusive dressing. All polyurethane foams increased the number of myofibroblasts over occlusive dressing, with maximal presence in large pore size foam-treated wounds compared with all other groups (p < 0.05). CONCLUSIONS: The pore size of the interface material of suction devices has a significant impact on the wound healing response. Larger pores increased wound surface strain, tissue growth, and transformation of contractile cells. Modification of the pore size is a powerful approach for meeting biological needs of specific wounds.

Relevance:

30.00%

Publisher:

Abstract:

Medical implants, like cardiovascular devices, improve the quality of life for countless individuals but may become infected with bacteria like Staphylococcus aureus. Such infections take the form of a biofilm, a structured community of bacterial cells adherent to the surface of a solid substrate. Every biofilm begins with an attractive force or bond between bacterium and substratum. We used atomic force microscopy to probe experimentally forces between a fibronectin-coated surface (i.e., proxy for an implanted cardiac device) and fibronectin-binding receptors on the surface of individual living bacteria from each of 80 clinical isolates of S. aureus. These isolates originated from humans with infected cardiac devices (CDI; n = 26), uninfected cardiac devices (n = 20), and the anterior nares of asymptomatic subjects (n = 34). CDI isolates exhibited a distinct binding-force signature and had specific single amino acid polymorphisms in fibronectin-binding protein A corresponding to E652D, H782Q, and K786N. In silico molecular dynamics simulations demonstrate that residues D652, Q782, and N786 in fibronectin-binding protein A form extra hydrogen bonds with fibronectin, complementing the higher binding force and energy measured by atomic force microscopy for the CDI isolates. This study is significant, because it links pathogenic bacteria biofilms from the length scale of bonds acting across a nanometer-scale space to the clinical presentation of disease at the human dimension.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The purpose of this study was to analyze the debris captured in the distal protection filters used during carotid artery stenting (CAS). Background: CAS is an option available to high-risk patients requiring revascularization. Filters are suggested for optimal stroke prevention during CAS. Methods: From May 2005 to June 2007, filters from 59 asymptomatic patients who underwent CAS were collected and sent to a specialized laboratory for light-microscope and histological analysis. Peri- and postprocedural outcomes were assessed during 1-year follow-up. Results: On the basis of biomedical imaging of the filter debris, the captured material could not be identified as embolized particles from the carotid plaque. On histological analysis the debris consisted mainly of red blood cell aggregates and/or platelets, occasionally accompanied by granulocytes. We found no consistent histological evidence of embolized particles originating from atherosclerotic plaques. Post-procedure, three neurological events were reported: two (3.4%) transient ischemic attacks (TIA) and one (1.7%) ipsilateral minor stroke. Conclusion: The filters used during CAS in asymptomatic patients planned for cardiac surgery often remained empty. These findings may be explained by assuming that asymptomatic patients feature a different atherosclerotic plaque composition or stabilization through antiplatelet medication. Larger, randomized trials are clearly warranted, especially in the asymptomatic population. © 2012 Wiley Periodicals, Inc.

Relevance:

30.00%

Publisher:

Abstract:

We present a new asymptotic formula for the maximum static voltage in a simplified model for on-chip power distribution networks of array bonded integrated circuits. In this model the voltage is the solution of a Poisson equation in an infinite planar domain whose boundary is an array of circular pads of radius ε, and we deal with the singular limit ε → 0. In comparison with approximations that appear in the electronic engineering literature, our formula is more complete, since we have obtained terms up to order ε^15. A procedure is presented to compute all the successive terms, which can be interpreted as using multipole solutions of equations involving spatial derivatives of functions. To deduce the formula we use the method of matched asymptotic expansions. Our results are completely analytical, and we make extensive use of special functions and of Gauss's constant G.
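The setting described above can be sketched schematically as follows; the source term, the pad boundary condition and the form of the expansion coefficients are assumptions for illustration, not taken from the paper:

```latex
% Schematic model problem (notation assumed for illustration):
% V is the static voltage outside an array of circular pads of radius eps,
\nabla^2 V = -f(x) \quad \text{in } \Omega_\varepsilon
           = \mathbb{R}^2 \setminus \textstyle\bigcup_{i} B_\varepsilon(x_i),
% with a condition on each pad boundary (assumed here to be Dirichlet),
V = 0 \quad \text{on } \partial B_\varepsilon(x_i),
% and the quantity expanded in the singular limit is the maximum voltage:
V_{\max}(\varepsilon) = \max_{x \in \Omega_\varepsilon} V(x)
  \sim \sum_{k=0}^{15} c_k(\log \varepsilon)\, \varepsilon^{k},
  \qquad \varepsilon \to 0.
```

The method of matched asymptotic expansions proceeds by solving an inner problem near each pad and an outer problem away from the pads, then fixing the free constants by matching the two expansions in an intermediate region.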

Relevance:

30.00%

Publisher:

Abstract:

This project offers an introduction to the concepts of RFID and contactless cards, focusing on the widely used MIFARE Classic chip. The main objective is to show how it works and what its vulnerabilities are, together with some practical examples based on an analysis of different services that use these cards.
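One of the best-known weaknesses of the MIFARE Classic is that its proprietary Crypto1 cipher uses only 48-bit keys. A minimal back-of-the-envelope sketch of what that means (the trial rate below is an illustrative assumption, not a measured figure):

```python
# Back-of-the-envelope sketch: why a 48-bit key space (as used by MIFARE
# Classic's Crypto1 cipher) is within reach of exhaustive search.
# The attack rate below is an illustrative assumption, not a measured figure.

KEY_BITS = 48                      # Crypto1 key length
keyspace = 2 ** KEY_BITS           # 281,474,976,710,656 candidate keys

def brute_force_hours(keys_per_second: float) -> float:
    """Worst-case exhaustive-search time in hours at a given trial rate."""
    return keyspace / keys_per_second / 3600

# Hypothetical cracking rate of one billion keys per second:
print(f"{keyspace:,} keys")
print(f"~{brute_force_hours(1e9):.0f} h at 10^9 keys/s")
```

At the assumed rate of 10^9 keys per second the full key space falls to exhaustive search in just over three days; the published attacks on Crypto1 are far faster still, exploiting structural weaknesses in the cipher rather than brute force.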

Relevance:

30.00%

Publisher:

Abstract:

A power electronics device is a control and regulation system that converts electricity from its available form into a desired new form while managing the flow of electric power from the source to the load. This differs from signal electronics, in which electricity is typically used to transfer information by exploiting different states. Power electronics devices are usually compared in terms of reliability, size, efficiency, control accuracy and, of course, price. Typical power electronics devices include frequency converters, UPS (Uninterruptible Power Supply) devices, welding machines, induction heaters and various power supplies. Traditionally, the control of these devices has been implemented using microprocessors, ASICs (Application Specific Integrated Circuits), ICs (Integrated Circuits) and analogue controllers. This study analyses the suitability of FPGAs (Field Programmable Gate Arrays) for the control of power electronics. An FPGA consists of various logic elements and the interconnections between them. The logic elements are gate circuits and flip-flops. The interconnections and logic elements are fixed in the device; neither their composition nor their number can be changed afterwards. Programmability arises from the connections between the elements. The device contains numerous switches, up to millions of them, whose state can be set, so that an innumerable variety of functional entities can be formed from the basic elements. FPGAs have long been used in communications products, which is why their development has been rapid in recent years while prices have fallen. As a result, the FPGA has become an attractive alternative also for the control of power electronics devices. In this doctoral work, the suitability of FPGAs was studied using two demanding and dissimilar practical power electronics devices: a frequency converter and a welding machine.
For both test cases, suitable prototypes were built together with Finnish industrial companies in the field, and their control electronics were converted to FPGA-based implementations. In addition, new types of control methods exploiting this new technology were developed. The operation of the prototypes was compared with corresponding commercial products controlled by traditional methods, and the benefits of the parallel computation enabled by FPGAs were observed in the performance of both power electronics devices. The work also presents new methods and tools for the development and testing of FPGA-based control systems; with these methods, product development can be made as fast and efficient as possible. Furthermore, an internal FPGA control and communication bus structure serving power electronics control applications was developed. The new communication structure also promotes the reusability of already implemented subsystems in future applications and product generations.
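The logic-element structure described in the abstract can be illustrated with a toy software model: a k-input look-up table (LUT) whose hardware is fixed while only its configuration bits change. The sketch below is for illustration only and is not code from the thesis:

```python
# Toy model of an FPGA logic element: a k-input look-up table (LUT).
# The hardware itself is fixed; "programming" only sets the truth-table
# bits, mirroring the point that FPGA programmability comes from
# configurable connections and settings, not from changeable hardware.

def make_lut(truth_table_bits):
    """Build a LUT from its configuration bits (index 0 = all inputs low)."""
    def lut(*inputs):
        index = 0
        for bit in reversed(inputs):       # pack input bits into a table index
            index = (index << 1) | bit
        return truth_table_bits[index]
    return lut

# The same 2-input element configured as AND (bits 0001) or XOR (bits 0110):
and_gate = make_lut([0, 0, 0, 1])
xor_gate = make_lut([0, 1, 1, 0])
```

Real FPGA logic elements pair such LUTs with flip-flops, and a synthesis tool computes the configuration bits for every LUT and routing switch from a hardware description.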

Relevance:

30.00%

Publisher:

Abstract:

1. Introduction

"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish copyright legislation in 1998. Now in the year 2005, after more than half a decade of the domestic implementation, it is yet uncertain as to the proper meaning and construction of the convoluted qualitative criteria the current legislation employs as a prerequisite for database protection both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and, second, to realise the potential and risks inherent in the new legislation in economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues

The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches either qualitatively or quantitatively the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form or even free by means of public lending libraries providing access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while the production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. The fear of illegal copying can lead to stark technical protection that in turn can dampen down the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials available lawfully in electronic form, and thus weaken the possibility of access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the Digital Economy and Information Networks

All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3 previously protected partially by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, importantly, numerous databases are collected in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not instantaneously obvious example of this is a database consisting of the physical coordinates of a certain selected group of customers for marketing purposes through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill or lead to interpretations that are at variance with the analogue domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services.

4. International sphere

After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, not yet having a Sui Generis database regime or its kin, while both the political and academic discourse on the matter abounds.

5. The objectives of the study

The above-mentioned background, with its several open issues, calls for a detailed study of the following questions:
- What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation?
- How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context?
- What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications?
- The difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition

The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part has the purpose of introducing the political and rational background and subsequent legislative evolution path of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of the machinery in real-life societal and particularly commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinise the implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue concerning the IPR protection of information per se, a new tenet in the domain of copyright and related rights.

Relevance:

30.00%

Publisher:

Abstract:

This work proposes the creation of a bioinspired electronic white cane for blind people, using the whiskers principle for short-range navigation and exploration. Whiskers are the coarse hairs on an animal's face that tell the animal it has touched something, via the nerves of the skin. In this work, the raw data acquired from a small terrestrial LIDAR and a tri-axial accelerometer are converted into tactile information using several electromagnetic devices configured as a tactile belt. The LIDAR and the accelerometer are attached to the user's forearm and connected by a wire to the control unit placed on the belt. Early validation experiments carried out in the laboratory are promising in terms of usability and description of the environment.
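A device of this kind needs some rule for turning range readings into tactile output. The sketch below is a hypothetical range-to-vibration mapping; the names, range limit and linear intensity law are illustrative assumptions, as the abstract does not specify the mapping used:

```python
# Hypothetical sketch of a range-to-tactile mapping such a device might
# use: nearer obstacles drive the belt's vibration motors harder. All
# names, ranges and the intensity law here are illustrative assumptions.

MAX_RANGE_M = 4.0          # assumed useful range of the small LIDAR

def vibration_intensity(distance_m: float) -> float:
    """Map an obstacle distance to a motor duty cycle in [0, 1]."""
    if distance_m >= MAX_RANGE_M:
        return 0.0                         # nothing close: belt stays quiet
    d = max(distance_m, 0.0)
    return 1.0 - d / MAX_RANGE_M           # linear: closer -> stronger

# A sweep of readings (metres) becomes per-motor intensities on the belt:
readings = [0.5, 2.0, 5.0]
intensities = [vibration_intensity(d) for d in readings]
```

A real implementation would also smooth the readings over time and fuse in the accelerometer data to compensate for the orientation of the forearm.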

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Therapeutic hypothermia following hypoxic ischaemic encephalopathy in term infants was introduced into Switzerland in 2005. Initial documentation of perinatal and resuscitation details was poor and neuromonitoring insufficient. In 2011, a National Asphyxia and Cooling Register was introduced. AIMS: To compare management of cooled infants before and after introduction of the register concerning documentation, neuromonitoring, cooling methods and evaluation of temperature variability between cooling methods. STUDY DESIGN: Data of cooled infants before the register was in place (first time period: 2005-2010) and afterwards (second time period: 2011-2012) was collected with a case report form. RESULTS: 150 infants were cooled during the first time period and 97 during the second time period. Most infants were cooled passively or passively with gel packs during both time periods (82% in 2005-2010 vs 70% in 2011-2012), however more infants were cooled actively during the second time period (18% versus 30%). Overall there was a significant reduction in temperature variability (p < 0.001) comparing the two time periods. A significantly higher proportion of temperature measurements within target temperature range (72% versus 77%, p < 0.001), fewer temperature measurements above (24% versus 7%, p < 0.001) and more temperatures below target range (4% versus 16%, p < 0.001) were recorded during the second time period. Neuromonitoring improved after introduction of the cooling register. CONCLUSION: Management of infants with HIE improved since introducing the register. Temperature variability was reduced, more temperature measurements in the target range and fewer temperature measurements above target range were observed. Neuromonitoring has improved, however imaging should be performed more often.

Relevance:

30.00%

Publisher:

Abstract:

Hydrogenated nanocrystalline silicon (nc-Si:H) obtained by hot-wire chemical vapour deposition (HWCVD) at low substrate temperature (150 °C) has been incorporated as the active layer in bottom-gate thin-film transistors (TFTs). These devices were electrically characterised by measuring in vacuum the output and transfer characteristics for different temperatures. The field-effect mobility showed a thermally activated behaviour which could be attributed to carrier trapping at the band tails, as in hydrogenated amorphous silicon (a-Si:H), and potential barriers for the electronic transport. Trapped charge at the interfaces of the columns, which are typical in nc-Si:H, would account for these barriers. By using the Levinson technique, the quality of the material at the column boundaries could be studied. Finally, these results were interpreted according to the particular microstructure of nc-Si:H.

Relevance:

30.00%

Publisher:

Abstract:

Electronic cigarettes are devices producing a vapour containing propylene glycol, flavourings and quickly delivered nicotine. 6.7% of the Swiss population, mainly smokers, have tried the electronic cigarette, while 0.1% use it daily. Despite uncertainty due to the low level of evidence, electronic cigarettes might be effective for smoking cessation and reduction. The safety of electronic cigarettes is demonstrated in the short term but not in the long term; however, their eventual toxicity is likely to be much lower than that of tobacco. Use of electronic cigarettes by non-smokers and by youth who do not smoke is low and seems unlikely to lead them to tobacco use. Recommended public health measures include product regulation with quality control, a ban on use in public places, and the prohibition of advertising and of sales to minors.

Relevance:

30.00%

Publisher:

Abstract:

Living bacteria or yeast cells are frequently used as bioreporters for the detection of specific chemical analytes or conditions of sample toxicity. In particular, bacteria or yeast equipped with synthetic gene circuitry that allows the production of a reliable non-cognate signal (e.g., fluorescent protein or bioluminescence) in response to a defined target make robust and flexible analytical platforms. We report here how bacterial cells expressing a fluorescence reporter ("bactosensors"), which are mostly used for batch sample analysis, can be deployed for automated semi-continuous target analysis in a single concise biochip. Escherichia coli-based bactosensor cells were continuously grown in a 13 or 50 nanoliter-volume reactor on a two-layered polydimethylsiloxane-on-glass microfluidic chip. Physiologically active cells were directed from the nl-reactor to a dedicated sample exposure area, where they were concentrated and reacted in 40 minutes with the target chemical by localized emission of the fluorescent reporter signal. We demonstrate the functioning of the bactosensor-chip by the automated detection of 50 μg arsenite-As l(-1) in water on consecutive days and after a one-week constant operation. Best induction of the bactosensors of 6-9-fold to 50 μg l(-1) was found at an apparent dilution rate of 0.12 h(-1) in the 50 nl microreactor. The bactosensor chip principle could be widely applicable to construct automated monitoring devices for a variety of targets in different environments.
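The reported operating point can be checked with simple chemostat arithmetic: the dilution rate D equals feed flow over reactor volume, so D = 0.12 h⁻¹ in the 50 nl reactor implies a feed of 6 nl per hour. A minimal sketch of that arithmetic (the function names are ours, not from the paper):

```python
# The abstract reports best induction at an apparent dilution rate of
# 0.12 h^-1 in the 50 nl microreactor. For a continuously fed reactor,
# the dilution rate is D = F / V, so the implied feed flow is F = D * V.

def feed_flow_nl_per_h(dilution_rate_per_h: float, volume_nl: float) -> float:
    """Feed flow rate (nl/h) that sustains a given dilution rate."""
    return dilution_rate_per_h * volume_nl

def mean_residence_time_h(dilution_rate_per_h: float) -> float:
    """Average time a cell spends in the reactor: 1 / D."""
    return 1.0 / dilution_rate_per_h

flow = feed_flow_nl_per_h(0.12, 50)     # 6.0 nl/h
tau = mean_residence_time_h(0.12)       # about 8.3 h
```

The roughly eight-hour mean residence time is consistent with the cells remaining physiologically active long enough to be routed to the exposure area and reacted with the target.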

Relevance:

30.00%

Publisher:

Abstract:

Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The reliability decrease is a consequence of physical limitations, the relative increase of variations, and decreasing noise margins, among others. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods which introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for the network-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, routing protocol, and flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. Error control coding is an efficient fault tolerance method at this abstraction level, especially against transient faults. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults; therefore, other solutions against them are presented. The introduction of spare wires and split transmissions is shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated.
At the network layer, positioned above the data link layer, fault tolerance can be achieved through the design of fault tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
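As an illustration of the error control coding discussed for the data link layer, the sketch below implements a classic Hamming(7,4) single-error-correcting code. The thesis does not specify this particular code; it stands in here for the general technique:

```python
# Illustrative single-error-correcting Hamming(7,4) codec, of the kind
# used for error control coding on on-chip links against transient faults.

def hamming74_encode(d):
    """d: four data bits -> seven-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the four data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # checks codeword positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # checks codeword positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # checks codeword positions 4,5,6,7
    syndrome = s1 + (s2 << 1) + (s3 << 2)   # 1-based error position, 0 = clean
    if syndrome:
        c[syndrome - 1] ^= 1                # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[4] ^= 1                            # inject a transient single-bit fault
assert hamming74_decode(codeword) == [1, 0, 1, 1]
```

Codes of this family are attractive on-chip because the encoder and decoder reduce to a handful of XOR gates per wire, but, as the thesis notes, they cannot by themselves cope with intermittent or permanent faults, which call for spare wires or split transmissions.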