959 results for Front-end
Abstract:
Chronic graft-versus-host disease (cGVHD) is a frequent cause of morbidity and mortality after allogeneic hematopoietic stem cell transplantation (allo-HSCT), and severely compromises patients' physical capacity. Despite the aggressive nature of the disease, aerobic exercise training can positively impact survival as well as clinical and functional parameters. We analyzed potential mechanisms underlying the recently reported improvement in cardiac function in an exercise-trained cGVHD murine model receiving lethal total body irradiation and immunosuppressant treatment (Fiuza-Luces et al., 2013. Med Sci Sports Exerc 45, 1703-1711). We hypothesized that autophagy, a cellular quality-control mechanism that is receiving growing attention in biomedicine, was involved in this improvement. Our results suggest that exercise training elicits a positive autophagic adaptation in the myocardium that may help preserve cardiac function even at the end stage of a devastating disease like cGVHD. These preliminary findings might provide new insights into the cardiac benefits of exercise in chronic/debilitating conditions.
Abstract:
Grattan, J.P., Gilbertson, D.D., Hunt, C.O. (2007). The local and global dimensions of metalliferous air pollution derived from a reconstruction of an 8 thousand year record of copper smelting and mining at a desert-mountain frontier in southern Jordan. Journal of Archaeological Science 34, 83-110.
Abstract:
Urquhart, C. (editor for JUSTEIS team), Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Armstrong, A., Lonsdale, R. & Fenton, R. (2003). JUSTEIS (JISC Usage Surveys: Trends in Electronic Information Services) Strand A: survey of end users of all electronic information services (HE and FE), with Action research report. Final report 2002/2003 Cycle Four. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth with Information Automation Ltd (CIQM). Sponsorship: JISC
Abstract:
Williams, H. (2006). Ludwig Feuerbach's Critique of Religion and the End of Moral Philosophy. In Moggach, D. (Ed.), The New Hegelians: Politics and Philosophy in the Hegelian School (pp. 50-66). Cambridge: Cambridge University Press. Introduction; Part I. Eduard Gans: 1. Eduard Gans on poverty and on the constitutional debate; 2. Ludwig Feuerbach's Critique of Religion and the end of moral philosophy; Part II. Ludwig Feuerbach: 3. The symbolic dimension and the politics of Left Hegelianism; Part III. Bruno Bauer: 4. Exclusiveness and political universalism in Bruno Bauer; 5. Republican rigorism and emancipation in Bruno Bauer; Part IV. Edgar Bauer: 6. Edgar Bauer and the origins of the theory of terrorism; Max Stirner: 7. Ein Menschenleben: Hegel and Stirner; 8. 'The State and I': Max Stirner's anarchism; Friedrich Engels: 9. Engels and the invention of the catastrophist conception of the industrial revolution; Karl Marx: 10. The basis of the state in the Marx of 1842; 11. Marx and Feuerbachian essence: returning to the question of 'Human Essence' in historical materialism; 12. Freedom and the 'Realm of Necessity'; Concluding with Hegel: 13. Work, language and community: a response to Hegel's critics. RAE2008
Abstract:
http://books.google.com/books?vid=OCLC55772204
Abstract:
Current low-level networking abstractions on modern operating systems are commonly implemented in the kernel to provide sufficient performance for general-purpose applications. However, it is desirable for high-performance applications to have more control over the networking subsystem to support optimizations for their specific needs. One approach is to allow networking services to be implemented at user level. Unfortunately, this typically incurs costs due to scheduling overheads and unnecessary data copying via the kernel. In this paper, we describe a method to implement efficient application-specific network service extensions at user level that removes the cost of scheduling and provides protected access to lower-level system abstractions. We present a networking implementation that, with minor modifications to the Linux kernel, passes data between "sandboxed" extensions and the Ethernet device without copying or processing in the kernel. Using this mechanism, we put a customizable networking stack into a user-level sandbox and show how it can be used to efficiently process and forward data via proxies, or intermediate hosts, in the communication path of high-performance data streams. Unlike other user-level networking implementations, our method imposes no special hardware requirements to avoid unnecessary data copies. Results show that we achieve a substantial increase in throughput over comparable user-space methods using our networking stack implementation.
Abstract:
ERRATA: We present corrections to Fact 3 and (as a consequence) to Lemma 1 of BUCS Technical Report BUCS-TR-2000-013 (also published in IEEE ICNP'2000) [1]. These corrections result in slight changes to the formulae used for the identification of shared losses, which we quantify.
Abstract:
Recent work has shown the prevalence of small-world phenomena [28] in many networks. Small-world graphs exhibit a high degree of clustering, yet typically have short path lengths between arbitrary vertices. Internet AS-level graphs have been shown to exhibit small-world behaviors [9]. In this paper, we show that both Internet AS-level and router-level graphs exhibit small-world behavior. We attribute such behavior to two possible causes: the high variability of vertex degree distributions (which were found to follow approximately a power law [15]) and the preference of vertices for local connections. We show that both factors contribute, to different degrees, to the small-world behavior of AS-level and router-level topologies. Our findings underscore the inefficacy of the Barabasi-Albert model [6] in explaining the growth process of the Internet, and provide a basis for more promising approaches to the development of Internet topology generators. We present such a generator and show the resemblance of the synthetic graphs it generates to real Internet AS-level and router-level graphs. Using these graphs, we have examined how small-world behaviors affect the scalability of end-system multicast. Our findings indicate that lower variability of vertex degree and stronger preference for local connectivity in small-world graphs result in slower network neighborhood expansion and longer average path lengths between two arbitrary vertices, which in turn results in better scaling of end-system multicast.
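The two statistics that define small-world behavior in the abstract above, high clustering together with short average paths, can be made concrete with a minimal pure-Python sketch. The 8-vertex ring lattice below is a toy input, not the paper's topology generator:

```python
from collections import deque

def clustering(adj, v):
    """Fraction of pairs of v's neighbours that are themselves linked."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def avg_clustering(adj):
    return sum(clustering(adj, v) for v in adj) / len(adj)

def avg_path_length(adj):
    """Mean BFS shortest-path length over all ordered reachable pairs."""
    total = pairs = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Toy ring lattice: 8 vertices, each joined to its 2 nearest neighbours
# on either side -- locally clustered, yet few hops between any pair.
ring = {i: {(i + d) % 8 for d in (-2, -1, 1, 2)} for i in range(8)}
print(avg_clustering(ring), avg_path_length(ring))
```

On this lattice every vertex has clustering coefficient 0.5 while the mean shortest-path length stays under 1.5 hops; randomly rewiring a few edges is the classic way such graphs keep high clustering while shrinking path lengths further.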
Abstract:
End-to-End differentiation between wireless and congestion loss can equip TCP control so it operates effectively in a hybrid wired/wireless environment. Our approach integrates two techniques: packet loss pairs (PLP) and Hidden Markov Modeling (HMM). A packet loss pair is formed by two back-to-back packets, where one packet is lost while the second packet is successfully received. The purpose is for the second packet to carry the state of the network path, namely the round trip time (RTT), at the time the other packet is lost. Under realistic conditions, PLP provides strong differentiation between congestion and wireless type of loss based on distinguishable RTT distributions. An HMM is then trained so observed RTTs can be mapped to model states that represent either congestion loss or wireless loss. Extensive simulations confirm the accuracy of our HMM-based technique in classifying the cause of a packet loss. We also show the superiority of our technique over the Vegas predictor, which was recently found to perform best and which exemplifies other existing loss labeling techniques.
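The HMM step described above can be sketched with a toy two-state Gaussian model decoded by the Viterbi algorithm. All parameters below (transition matrix, per-state RTT means and deviations) are invented for illustration, not the trained values from the paper:

```python
import math

# Hypothetical two-state model: low RTTs suggest a wireless loss on an
# uncongested path, inflated RTTs suggest a congestion loss.
STATES = ("wireless", "congestion")
start = [0.5, 0.5]
trans = [[0.8, 0.2], [0.2, 0.8]]              # assumed "sticky" transitions
means, stds = [50.0, 120.0], [10.0, 25.0]     # RTT in ms, illustrative only

def log_gauss(x, mu, sigma):
    """Log density of a Gaussian observation model."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def viterbi(rtts):
    """Most likely loss-cause sequence for a series of loss-pair RTTs."""
    v = [math.log(start[s]) + log_gauss(rtts[0], means[s], stds[s]) for s in range(2)]
    back = []
    for x in rtts[1:]:
        step, col = [], []
        for s in range(2):
            best = max(range(2), key=lambda p: v[p] + math.log(trans[p][s]))
            col.append(v[best] + math.log(trans[best][s]) + log_gauss(x, means[s], stds[s]))
            step.append(best)
        v = col
        back.append(step)
    s = max(range(2), key=lambda i: v[i])
    path = [s]
    for step in reversed(back):
        s = step[s]
        path.append(s)
    path.reverse()
    return [STATES[i] for i in path]

print(viterbi([48, 52, 130, 145, 55]))
```

The two RTT samples near 130-145 ms decode to the congestion state, the rest to the wireless state; in the paper's setting the model parameters would come from training on observed loss pairs rather than being fixed by hand.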
Abstract:
Current Internet transport protocols make end-to-end measurements and maintain per-connection state to regulate the use of shared network resources. When two or more such connections share a common endpoint, there is an opportunity to correlate the end-to-end measurements made by these protocols to better diagnose and control the use of shared resources. We develop packet probing techniques to determine whether a pair of connections experience shared congestion. Correct, efficient diagnoses could enable new techniques for aggregate congestion control, QoS admission control, connection scheduling and mirror site selection. Our extensive simulation results demonstrate that the conditional (Bayesian) probing approach we employ provides superior accuracy, converges faster, and tolerates a wider range of network conditions than recently proposed memoryless (Markovian) probing approaches for addressing this opportunity.
Abstract:
Coeliac disease is one of the most common food intolerances worldwide, and at present the gluten-free diet remains the only suitable treatment. A market overview conducted as part of this thesis on the nutritional and sensory quality of commercially available gluten-free breads and pasta showed that improvements are necessary. Many products show strong off-flavors, poor mouthfeel and reduced shelf-life. Since life-long avoidance of the cereal protein gluten means a major change to the diet, it is important to also consider the nutritional value of products intended to replace staple foods such as bread or pasta. This thesis addresses this issue by characterising available gluten-free cereal and pseudocereal flours to facilitate a better choice of raw materials. It was observed that quinoa, buckwheat and teff in particular are high in essential nutrients such as protein, minerals and folate. In addition, the potential of functional ingredients such as inulin, β-glucan, HPMC and xanthan to improve loaf quality was evaluated. Results show that these ingredients can increase loaf volume and reduce crumb hardness as well as the rate of staling, but that the effect diverges strongly depending on the bread formulation used. Furthermore, fresh egg pasta formulations based on teff and oat flour were developed. The resulting products were characterised regarding sensory and textural properties as well as in vitro digestibility. Scanning electron and confocal laser scanning microscopy were used throughout the thesis to visualise structural changes occurring during baking and pasta making.
Abstract:
This thesis argues that, through the prism of America's Cold War, scientism has emerged as the metanarrative of the postnuclear age. The advent of the bomb brought about a new primacy for mechanical and hyperrational thinking in the corridors of power, not just in terms of managing the bomb itself but in diffusing this ideology throughout the culture in the social sciences, economics and other institutional systems. The human need to mitigate or guard against the chaos of the universe lies at the heart not just of religious faith but of the desire for perfect control. Thus there has been a transference of power from religious faith to the apparent material power of science and technology and the terra firma these supposedly objective means supply. The Cold War, however, was a highly ideologically charged opposition between the two superpowers, and the scientific methodology that sprang forth to manage the Cold War and the bomb in the United States was not an objective scientific system divorced from paranoia and dogma, but a system that assumed a radically fundamentalist idea of capitalism. This is apparent in the widespread diffusion of game theory throughout Western postindustrial institutions. The inquiry of the thesis thus examines texts that engage with and criticise American Cold War methodology, beginning with the nuclear moment, so to speak, and Dr Strangelove's incisive satire of moral abdication to machine processes. Moving on chronologically, the thesis examines the diffusion of particular kinds of masculinity and sexuality in postnuclear culture in Crash and End Zone, finishing its analysis with the ethnographic portrayal of a modern American city in The Wire. More than anything else, the thesis wishes to reveal to what extent this technocratic consciousness puts pressure on language and on binding narratives.
Abstract:
In the last decade, we have witnessed the emergence of large, warehouse-scale data centres which have enabled new internet-based software applications such as cloud computing, search engines, social media, e-government etc. Such data centres consist of large collections of servers interconnected using short-reach (up to a few hundred meters) optical interconnect. Today, transceivers for these applications achieve up to 100Gb/s by multiplexing 10x 10Gb/s or 4x 25Gb/s channels. In the near future, however, data centre operators have expressed a need for optical links which can support 400Gb/s up to 1Tb/s. The crucial challenge is to achieve this in the same footprint (same transceiver module) and with similar power consumption as today's technology. Straightforward scaling of the currently used space or wavelength division multiplexing may be difficult to achieve: indeed, a 1Tb/s transceiver would require integration of 40 VCSELs (vertical cavity surface emitting laser diodes, widely used for short-reach optical interconnect), 40 photodiodes and the electronics operating at 25Gb/s in the same module as today's 100Gb/s transceiver. Pushing the bit rate on such links beyond today's commercially available 100Gb/s/fibre will require new generations of VCSELs and their driver and receiver electronics. This work looks into a number of state-of-the-art technologies, investigates their performance constraints, and recommends different designs, specifically targeting multilevel modulation formats. Several methods to extend the bandwidth using deep submicron (65nm and 28nm) CMOS technology are explored in this work, while also maintaining a focus on reducing power consumption and chip area. The techniques used were pre-emphasis on the rising and falling edges of the signal, and bandwidth extension by inductive peaking and different local feedback techniques.
These techniques have been applied to a transmitter and receiver developed for advanced modulation formats such as PAM-4 (4-level pulse amplitude modulation). Such a modulation format increases the throughput per individual channel, which helps to overcome the challenges mentioned above in realizing 400Gb/s to 1Tb/s transceivers.
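The throughput gain from PAM-4 comes from packing two bits into each transmitted symbol. A minimal sketch of the conventional Gray-coded level mapping, independent of the circuits developed in this work:

```python
# Gray-coded PAM-4: two bits per symbol, so a lane carries twice the
# bit rate of binary (NRZ) signalling at the same symbol rate.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_modulate(bits):
    """Map an even-length bit sequence to PAM-4 amplitude levels."""
    if len(bits) % 2:
        raise ValueError("PAM-4 consumes bits two at a time")
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_demodulate(levels):
    """Decide the nearest nominal level, then emit its bit pair."""
    inv = {v: k for k, v in GRAY_PAM4.items()}
    out = []
    for lvl in levels:
        nearest = min(inv, key=lambda v: abs(v - lvl))
        out.extend(inv[nearest])
    return out

bits = [0, 0, 1, 1, 1, 0, 0, 1]
print(pam4_modulate(bits))    # 4 symbols carry what NRZ needs 8 slots for
```

With Gray coding, a decision error to an adjacent level corrupts only one of the two bits, which is why it is the usual mapping for PAM-4 links; the trade-off driving the circuit work above is that the three smaller eye openings demand more linearity and bandwidth from the driver and receiver.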
Abstract:
PURPOSE: Overall survival (OS) can be observed only after prolonged follow-up, and any potential effect of first-line therapies on OS may be confounded by the effects of subsequent therapy. We investigated whether tumor response, disease control, progression-free survival (PFS), or time to progression (TTP) could be considered a valid surrogate for OS to assess the benefits of first-line therapies for patients with metastatic breast cancer. PATIENTS AND METHODS: Individual patient data were collected on 3,953 patients in 11 randomized trials that compared an anthracycline (alone or in combination) with a taxane (alone or in combination with an anthracycline). Surrogacy was assessed through the correlation between the end points as well as through the correlation between the treatment effects on the end points. RESULTS: Tumor response (survival odds ratio [OR], 6.2; 95% CI, 5.3 to 7.0) and disease control (survival OR, 5.5; 95% CI, 4.8 to 6.3) were strongly associated with OS. PFS (rank correlation coefficient, 0.688; 95% CI, 0.686 to 0.690) and TTP (rank correlation coefficient, 0.682; 95% CI, 0.680 to 0.684) were moderately associated with OS. Response log ORs were strongly correlated with PFS log hazard ratios (linear coefficient [rho], 0.96; 95% CI, 0.73 to 1.19). Response and disease control log ORs and PFS and TTP log hazard ratios were poorly correlated with log hazard ratios for OS, but the confidence limits of rho were too wide to be informative. CONCLUSION: No end point could be demonstrated as a good surrogate for OS in these trials. Tumor response may be an acceptable surrogate for PFS.
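The rank correlation coefficients reported above measure how well one endpoint tracks another. A minimal sketch of a Spearman rank correlation on invented per-patient PFS/OS times; the actual surrogacy analysis also correlates treatment effects across trials and is considerably more involved:

```python
def ranks(xs):
    """1-based ranks; ties are not handled in this sketch."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Invented per-patient PFS and OS times (months), purely illustrative.
pfs = [4.1, 6.3, 2.0, 9.8, 5.5, 7.2]
os_ = [11.0, 15.2, 6.5, 30.1, 12.9, 14.0]
print(spearman(pfs, os_))
```

A coefficient near 0.68, as reported for PFS and TTP versus OS, indicates only moderate association: patients who progress later tend to live longer, but far from deterministically, which is why correlation of endpoints alone does not establish surrogacy.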
Abstract:
PURPOSE: Little is known about young caregivers of people with advanced life-limiting illness. A better understanding of the needs and characteristics of these young caregivers can inform the development of palliative care and other support services. METHODS: A population-based analysis of caregivers was performed using piloted questions included in the 2001-2007 face-to-face annual health surveys of 23,706 South Australians on the death of a loved one, the caregiving provided, and the characteristics of the deceased individual and the caregiver. The survey was representative of the population by age, gender, and region of residence. FINDINGS: Most active care was provided by older, close family members, but large numbers of young people (ages 15-29) also provided assistance to individuals with advanced life-limiting illness. They comprised 14.4% of those undertaking "hands-on" care on a daily or intermittent basis, whom we grouped together as active caregivers. Almost as many young males as females participated in active caregiving (males represent 46%); most provided care while being employed, including 38% who worked full-time. Over half of those engaged in hands-on care indicated that the experience was worse or much worse than expected, with young people reporting dissatisfaction more frequently. Young caregivers also exhibited an increased perception of the need for assistance with grief. CONCLUSION: Young people can be integral to end-of-life care, and represent a significant cohort of active caregivers with unique needs and experiences. They may have a more negative experience as caregivers, and increased needs for grief counseling services, compared to other age cohorts of caregivers.