1000 results for machine independent


Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Fabry disease is an X-linked disorder resulting from alpha-galactosidase A deficiency. The cardiovascular findings include left ventricular hypertrophy (LVH) and increased intima-media thickness of the common carotid artery (CCA IMT). The current study examined the possible correlation between these parameters. To corroborate these clinical findings in vitro, plasma from Fabry patients was tested for a possible proliferative effect on rat vascular smooth muscle cells (VSMCs) and mouse neonatal cardiomyocytes. METHODS AND RESULTS: Thirty male and 38 female patients were enrolled. LVH was found in 60% of men and 39% of women. Increased CCA IMT was equally present in males and females. There was a strong positive correlation between LV mass and CCA IMT (r²=0.27; P<0.0001). VSMC and neonatal cardiomyocyte proliferative responses in vitro correlated with CCA IMT (r²=0.39; P<0.0004) and LV mass index (r²=0.19; P=0.028), respectively. CONCLUSIONS: LVH and increased CCA IMT occur concomitantly in Fabry disease, suggesting a common pathogenesis. The underlying cause may be a circulating growth-promoting factor, whose presence was confirmed in vitro.

Relevance:

20.00%

Publisher:

Abstract:

Expression of co-inhibitory molecules is generally associated with T-cell dysfunction in chronic viral infections such as HIV or HCV, but their relative contributions to T-cell impairment remain unclear. In the present study, we evaluated the impact of the expression of co-inhibitory molecules such as 2B4, PD-1 and CD160 on the functions of CD8 T cells specific for influenza, EBV and CMV. We show that CD8 T-cell populations expressing CD160, but not PD-1, had reduced proliferative capacity and perforin expression, indicating that the functional impairment of CD160+ CD8 T cells may be independent of PD-1 expression. Blockade of the CD160/CD160-ligand interaction restored CD8 T-cell proliferative capacity, and the extent of restoration correlated directly with the ex vivo proportion of CD160+ CD8 T cells, suggesting that CD160 negatively regulates TCR-mediated signaling. Furthermore, unlike PD-1, CD160 expression was not up-regulated upon T-cell activation or proliferation. Taken together, these results provide evidence that CD160-associated CD8 T-cell functional impairment is independent of PD-1 expression.

Relevance:

20.00%

Publisher:

Abstract:

Automatic environmental monitoring networks, supported by wireless communication technologies, now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools for processing large amounts of available data and producing predictions at fine spatial scales. These are machine learning algorithms, aimed at robust non-parametric modelling of non-linear dependencies from empirical data. The computational efficiency of these data-driven methods allows prediction maps to be produced in real time, which makes them preferable to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) from measurements taken by the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
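
By way of illustration, here is a minimal sketch of the topo-climatic mapping workflow described above: predicting temperature at a grid cell from station coordinates and DEM-derived terrain features with support vector regression. The synthetic data, feature set and hyperparameters are hypothetical stand-ins, not the article's actual pipeline.

```python
# Sketch of topo-climatic mapping with support vector regression.
# All data and feature choices below are invented for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic "stations": coordinates plus DEM-derived predictors standing
# in for the real auxiliary information.
n_stations = 200
X = np.column_stack([
    rng.uniform(0, 100, n_stations),      # x coordinate (km)
    rng.uniform(0, 100, n_stations),      # y coordinate (km)
    rng.uniform(200, 3000, n_stations),   # DEM elevation (m)
    rng.uniform(0, 30, n_stations),       # DEM slope (degrees)
])
# Temperature driven mainly by a ~6.5 degC/km lapse rate, plus noise.
t = 15.0 - 6.5e-3 * X[:, 2] + rng.normal(0, 0.8, n_stations)

# Support vector regression, one member of the article's family of methods.
model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X, t)

# Predict the temperature for one grid cell of the output map.
print(model.predict([[50.0, 50.0, 1500.0, 10.0]]))
```

In practice the same model would be evaluated over every cell of the DEM grid to produce the prediction map; non-linear relief effects such as inversions are exactly what the non-parametric model is meant to pick up from the terrain features.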

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: The purpose of this study was to assess the effectiveness of a novel radiation-independent aiming device for distal locking of intramedullary nails in a human cadaver model. METHODS: A new targeting system was used in 25 intact human cadaver femora for the distal locking procedure after insertion of an intramedullary nail. The number of successful screw placements and the time needed for this locking procedure were recorded. The accuracy of the aiming process was evaluated by computed tomography. RESULTS: The duration of the distal locking process was 8.0 ± 1.8 minutes (mean ± SD; range, 4-11 minutes). None of the screw placements required fluoroscopic guidance. Computed tomography revealed high accuracy of the locking process. The incidence angle (α) of the locking screws through the distal locking holes of the nail was 86.8° ± 5.0° (mean ± SD; range, 80°-96°). Targeting failed in 1 static locking screw because of a material defect in the drilling sleeve. CONCLUSIONS: This cadaver study indicated that an aiming arm-based targeting device is highly reliable and accurate. The promising results suggest that it will help to decrease radiation exposure compared with the traditional "free-hand technique."

Relevance:

20.00%

Publisher:

Abstract:

In this article we extend the rational partisan model of Alesina and Gatti (1995) to include a second policy, fiscal policy, besides monetary policy. It is shown that, with this extension, the politically induced variance of output is not always eliminated, or even reduced, by delegating monetary policy to an independent and conservative central bank. Further, inflation and output stabilisation will be affected by the degree of conservativeness of the central bank and by the probability that the less inflation-averse party gains power. Keywords: rational partisan theory; fiscal policy; independent central bank. JEL Classification: E58, E63.
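
For orientation, a sketch of where the politically induced output variance comes from in the baseline rational partisan setup (the standard monetary-policy-only model behind Alesina and Gatti; the article's two-policy extension is not reproduced here). Output follows a Lucas-type supply curve, and electoral uncertainty between a high-inflation party L and a low-inflation party R makes expected inflation a probability-weighted average:

```latex
% Baseline rational partisan sketch (monetary policy only).
y_t = \bar{y} + (\pi_t - \pi_t^{e}),
\qquad \pi^{e} = P\,\pi_L + (1 - P)\,\pi_R
% With probability P party L wins and sets \pi_L, so output deviates by
% (1-P)(\pi_L - \pi_R); with probability 1-P it deviates by -P(\pi_L - \pi_R).
\operatorname{Var}(y) = P\,(1 - P)\,(\pi_L - \pi_R)^2
% Delegation to a conservative central bank narrows \pi_L - \pi_R; the
% article's point is that once fiscal policy is added, this variance is
% not always eliminated or even reduced.
```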

Relevance:

20.00%

Publisher:

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, industrial automation, embodied in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike the CISC world, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take given the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and the related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

20.00%

Publisher:

Abstract:

Diurnal oscillations of gene expression controlled by the circadian clock underlie rhythmic physiology in most living organisms. Although such rhythms have been extensively studied at the level of transcription and mRNA accumulation, little is known about the accumulation patterns of proteins. Here, we quantified temporal profiles of the murine hepatic proteome under physiological light-dark conditions using quantitative mass spectrometry with stable isotope labeling by amino acids. Our analysis identified over 5,000 proteins, of which several hundred showed robust diurnal oscillations, with peak phases enriched in the morning and during the night and related to core hepatic physiological functions. Combined mathematical modeling of temporal protein and mRNA profiles indicated that proteins accumulate with reduced amplitudes and significant delays, consistent with protein half-life data. Moreover, a group comprising about one-half of the rhythmic proteins showed no corresponding rhythmic mRNAs, indicating significant translational or posttranslational diurnal control. Such rhythms were highly enriched in secreted proteins accumulating tightly during the night. These rhythms also persisted in clock-deficient animals subjected to rhythmic feeding, suggesting that food-related entrainment signals influence rhythms in circulating plasma factors.
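
A minimal sketch of why damped, delayed protein rhythms follow from protein half-lives, assuming a standard linear production-degradation model (an assumption made here for illustration; the study's actual model may differ in detail):

```latex
% Sketch: protein P produced from rhythmic mRNA M, degraded at rate k_d.
\frac{dP}{dt} = k_s M(t) - k_d P(t),
\qquad M(t) = M_0 \bigl( 1 + \varepsilon \cos \omega t \bigr)
% The periodic steady state is a damped, delayed copy of the mRNA rhythm:
P(t) = \frac{k_s M_0}{k_d}
       \left( 1 + \frac{\varepsilon\, k_d}{\sqrt{k_d^2 + \omega^2}}
       \cos(\omega t - \phi) \right),
\qquad \phi = \arctan\frac{\omega}{k_d}
% Stable proteins (small k_d) thus oscillate with reduced relative
% amplitude and peak up to a quarter-period after their mRNA.
```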

Relevance:

20.00%

Publisher:

Abstract:

The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are discussed, and real case studies based on environmental and pollution data are carried out. The book includes a CD-ROM with the Machine Learning Office software and sample data sets, allowing both students and researchers to put the concepts into practice rapidly.

Relevance:

20.00%

Publisher:

Abstract:

For the recognition of sounds to benefit perception and action, their neural representations should also encode their current spatial position and their changes in position over time. The dual-stream model of auditory processing postulates separate (albeit interacting) processing streams for sound meaning and for sound location. Using a repetition priming paradigm in conjunction with distributed source modeling of auditory evoked potentials, we determined how individual sound objects are represented within these streams. Changes in perceived location were induced by interaural intensity differences, and sound location was either held constant or shifted across initial and repeated presentations (from one hemispace to the other in the main experiment or between locations within the right hemispace in a follow-up experiment). Location-linked representations were characterized by differences in priming effects between pairs presented to the same vs. different simulated lateralizations. These effects were significant at 20-39 ms post-stimulus onset within a cluster on the posterior part of the left superior and middle temporal gyri; and at 143-162 ms within a cluster on the left inferior and middle frontal gyri. Location-independent representations were characterized by a difference between initial and repeated presentations, independently of whether or not their simulated lateralization was held constant across repetitions. This effect was significant at 42-63 ms within three clusters on the right temporo-frontal region; and at 165-215 ms in a large cluster on the left temporo-parietal convexity. Our results reveal two varieties of representations of sound objects within the ventral/What stream: one location-independent, as initially postulated in the dual-stream model, and the other location-linked.

Relevance:

20.00%

Publisher:

Abstract:

On the Independent Studies Programme (ISP) and the Programa d'Estudis Independents (PEI) at MACBA

Relevance:

20.00%

Publisher:

Abstract:

Our work focuses on alleviating the workload of designers of adaptive courses in the complex task of authoring adaptive learning designs adjusted to specific user characteristics and the user context. We propose an adaptation platform consisting of a set of intelligent agents, where each agent carries out an independent adaptation task. The agents apply machine learning techniques to support user modelling for the adaptation process, as sketched below.
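
As an illustration only (the abstract does not specify the agents' interfaces, so every class, feature and label here is hypothetical), one such agent might wrap a small classifier that infers a single user-model attribute from logged interactions:

```python
# Hypothetical sketch of one adaptation agent; the platform's real
# interfaces are not described in the abstract.
from sklearn.tree import DecisionTreeClassifier


class UserModellingAgent:
    """One independent agent: infers a single user characteristic that
    the learning-design adaptation can consume downstream."""

    def __init__(self, attribute: str):
        self.attribute = attribute
        self.model = DecisionTreeClassifier(max_depth=3)

    def train(self, interaction_features, observed_labels):
        # e.g. rows of [time_on_video, time_on_text, quiz_score]
        self.model.fit(interaction_features, observed_labels)

    def update_user_model(self, user_model: dict, new_features) -> dict:
        # Write the inferred characteristic into the shared user model.
        user_model[self.attribute] = self.model.predict([new_features])[0]
        return user_model


# Usage sketch: this agent handles "preferred_media"; other agents would
# carry out other adaptation tasks independently, as the platform proposes.
agent = UserModellingAgent("preferred_media")
agent.train([[30, 5, 0.6], [2, 40, 0.9], [25, 8, 0.4]],
            ["video", "text", "video"])
print(agent.update_user_model({}, [28, 6, 0.7]))
```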

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: The purpose of this study was to adapt and improve a minimally invasive two-step postmortem angiographic technique for use on human cadavers. Detailed mapping of the entire vascular system is almost impossible with conventional autopsy tools. The technique described should be valuable in the diagnosis of vascular abnormalities. MATERIALS AND METHODS: Postmortem perfusion with an oily liquid is established with a circulation machine. An oily contrast agent is introduced as a bolus injection, and radiographic imaging is performed. In this pilot study, the upper or lower extremities of four human cadavers were perfused. In two cases, the vascular system of a lower extremity was visualized with anterograde perfusion of the arteries. In the other two cases, in which the suspected cause of death was drug intoxication, the veins of an upper extremity were visualized with retrograde perfusion of the venous system. RESULTS: In each case, the vascular system was visualized up to the level of the small supplying and draining vessels. In three of the four cases, vascular abnormalities were found. In one instance, a venous injection mark engendered by the self-administration of drugs was rendered visible by exudation of the contrast agent. In the other two cases, occlusion of the arteries and veins was apparent. CONCLUSION: The method described is readily applicable to human cadavers. After establishment of postmortem perfusion with paraffin oil and injection of the oily contrast agent, the vascular system can be investigated in detail and vascular abnormalities rendered visible.

Relevance:

20.00%

Publisher:

Abstract:

The computer simulation of reaction dynamics has now reached a remarkable degree of accuracy. Triatomic elementary reactions can be rigorously studied in great detail using a considerable variety of quantum dynamics computational tools available to the scientific community. In this contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches yield much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies and the computational effort required to account for the Coriolis couplings, are also analyzed in this paper.
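
To make the time-dependent strategy concrete, here is a minimal sketch of the core operation of a wave packet method: split-operator propagation of a one-dimensional Gaussian packet across a model barrier. This is an illustrative toy, not the Real Wave Packet or ABC codes, which treat the full triatomic problem; all parameters below are arbitrary choices.

```python
# Toy 1D split-operator wave packet propagation (illustrative only).
# Units: hbar = m = 1.
import numpy as np

n, L = 1024, 200.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = L / n
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)          # angular wavenumbers
V = 0.5 / np.cosh(0.5 * x) ** 2                  # model barrier

# Incoming Gaussian packet with mean momentum k0 (energy k0^2/2 above
# the barrier top, so most of the packet should transmit).
x0, k0, sigma = -40.0, 1.5, 5.0
psi = np.exp(-((x - x0) ** 2) / (4 * sigma ** 2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)    # normalize

dt, steps = 0.05, 1200
half_V = np.exp(-0.5j * V * dt)                  # half-step in V
full_T = np.exp(-0.5j * k ** 2 * dt)             # full kinetic step, k-space

# One step: e^{-iV dt/2} e^{-iT dt} e^{-iV dt/2}, kinetic part via FFT.
for _ in range(steps):
    psi = half_V * np.fft.ifft(full_T * np.fft.fft(half_V * psi))

# Transmission probability: norm of the packet beyond the barrier. A single
# propagation contains all energies in the packet, which is the energy-range
# advantage of time-dependent methods noted above.
print("P(transmission) ~", np.sum(np.abs(psi[x > 10.0]) ** 2) * dx)
```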