903 results for Advanced application and branching systems


Relevance:

100.00%

Publisher:

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library Services with prior arrangement.

Relevance:

100.00%

Publisher:

Abstract:

Endothelial tip cells guide angiogenic sprouts by exploring the local environment for guidance cues such as vascular endothelial growth factor (VegfA). Here we present Flt1 (Vegf receptor 1) loss- and gain-of-function data in zebrafish showing that Flt1 regulates tip cell formation and arterial branching morphogenesis. Zebrafish embryos expressed soluble Flt1 (sFlt1) and membrane-bound Flt1 (mFlt1). In Tg(flt1(BAC):yfp) × Tg(kdrl:ras-cherry)(s916) embryos, flt1:yfp was expressed in tip, stalk and base cells of segmental artery sprouts and overlapped with kdrl:cherry expression in these domains. flt1 morphants showed increased tip cell numbers, enhanced angiogenic behavior and hyperbranching of segmental artery sprouts. The additional arterial branches developed into functional vessels carrying blood flow. In support of a functional role for the extracellular VEGF-binding domain of Flt1, overexpression of sflt1 or mflt1 rescued aberrant branching in flt1 morphants, and overexpression of sflt1 or mflt1 in controls resulted in short arterial sprouts with reduced numbers of filopodia. flt1 morphants showed reduced expression of Notch receptors and of the Notch downstream target efnb2a, and ectopic expression of flt4 in arteries, consistent with loss of Notch signaling. Conditional overexpression of the notch1a intracellular cleaved domain in flt1 morphants restored segmental artery patterning. The developing nervous system of the trunk contributed to the distribution of Flt1, and the loss of flt1 affected neurons. Thus, Flt1 acts in a Notch-dependent manner as a negative regulator of tip cell differentiation and branching. Flt1 distribution may be fine-tuned, involving interactions with the developing nervous system.

Relevance:

100.00%

Publisher:

Abstract:

MEG beamformer algorithms work by assuming that correlated and spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that can be applied to raw MEG data to test this assumption explicitly. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
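The promax step is essentially an oblique factor rotation applied in channel space. As an illustrative sketch only (not the authors' pipeline, and using plain eigenvalue thresholding rather than promax), the following simulates two spatially distinct but correlated sources mixed into many sensors and counts the dominant components of the channel covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 64, 5000
t = np.arange(n_samples) / 1000.0

# Two spatially distinct sources; the second is partially correlated with the first.
s1 = np.sin(2 * np.pi * 10 * t)
s2 = 0.7 * s1 + 0.3 * np.sin(2 * np.pi * 22 * t)
sources = np.vstack([s1, s2])

# Random mixing into sensor space plus sensor noise.
mixing = rng.normal(size=(n_channels, 2))
data = mixing @ sources + 0.1 * rng.normal(size=(n_channels, n_samples))

# Eigen-decomposition of the channel covariance; count components carrying
# substantially more variance than the noise floor.
cov = np.cov(data)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
noise_floor = np.median(eigvals)
n_sources = int(np.sum(eigvals > 10 * noise_floor))
print(n_sources)
```

Even though the two sources are strongly correlated, the covariance still has rank two in signal space, so the count recovers both; a beamformer, by contrast, would tend to suppress or merge them.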

Relevance:

100.00%

Publisher:

Abstract:

The reasons for the restricted applicability of decision-making models in social and economic systems are examined. Three basic principles for increasing their adequacy are proposed: "localization" of solutions, direct accounting of the individual's influence on the decision-making process ("subjectivity of objectivity"), and reduction of the influence of the individual psychosomatic characteristics of the subject ("objectivity of subjectivity"). The principles are illustrated with mathematical models of decision making in ecological-economic and social systems.

Relevance:

100.00%

Publisher:

Abstract:

The paper considers the use and the information support of the most important mathematical application packages (APs), such as Maple, Matlab, Mathcad, Mathematica, Statistica and SPSS, which are widely used in the teaching of Calculus at universities. The main features of the packages and the information support on the producers' sites are outlined, as well as their capacity for work on the Internet, together with educational sites and literature related to them. The most important resources of the TeX system for the preparation of mathematical articles and documents are presented.
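As a reminder of the kind of TeX resource surveyed here, a minimal LaTeX article with one displayed formula looks like this (any standard TeX distribution compiles it):

```latex
\documentclass{article}
\usepackage{amsmath}  % standard AMS math extensions
\begin{document}
The Gaussian integral, typeset in display style:
\begin{equation}
  \int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi}
\end{equation}
\end{document}
```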

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we analyze the sensitivities of coherent optical receivers and microwave receivers. We derive theoretical limits of signal-to-noise ratio and bit error rate. By applying a generic approach to a broad range of receivers, we can compare their performance directly. Other publications have considered some of these receivers. However, their diverse nature obscures the big picture. Using our results as a unifying platform, previous publications can be compared and discrepancies between them identified.
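As an illustration of the kind of sensitivity limit derived in such analyses, the textbook shot-noise-limited bit error rate of an ideal homodyne BPSK coherent receiver is BER = ½ erfc(√(2N_p)), where N_p is the mean photon number per bit. This sketch (a standard result, not the paper's own derivation) evaluates it:

```python
from math import sqrt, erfc

def homodyne_bpsk_ber(photons_per_bit: float) -> float:
    """Shot-noise-limited BER for ideal homodyne BPSK detection."""
    return 0.5 * erfc(sqrt(2.0 * photons_per_bit))

# The classic quantum-limit benchmark: roughly 9 photons/bit for BER = 1e-9.
for n_p in (4, 9, 16):
    print(n_p, homodyne_bpsk_ber(n_p))
```

Evaluating receivers against a common formula like this is exactly the kind of unified comparison the paper argues for.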

Relevance:

100.00%

Publisher:

Abstract:

These lecture notes are devoted to presenting several uses of large deviation asymptotics in branching processes.
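As a concrete branching-process computation (a standard fact, not specific to these notes), the extinction probability q of a Galton–Watson process is the smallest fixed point of the offspring generating function, q = f(q); for Poisson(λ) offspring, f(q) = e^{λ(q−1)}, and iterating from 0 converges to that smallest fixed point:

```python
from math import exp

def extinction_probability(lam: float, iterations: int = 200) -> float:
    """Smallest fixed point of q = exp(lam * (q - 1)), found by iteration from 0."""
    q = 0.0
    for _ in range(iterations):
        q = exp(lam * (q - 1.0))
    return q

# Supercritical case (lam > 1): extinction is possible but not certain.
print(extinction_probability(1.5))  # ~0.417
# Subcritical case (lam < 1): extinction is certain.
print(extinction_probability(0.8))  # ~1.0
```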

Relevance:

100.00%

Publisher:

Abstract:

2002 Mathematics Subject Classification: 35L40

Relevance:

100.00%

Publisher:

Abstract:

There are many steps involved in developing a drug candidate into a formulated medicine and many involve analysis of chemical interaction or physical change. Calorimetry is particularly suited to such analyses as it offers the capacity to observe and quantify both chemical and physical changes in virtually any sample. Differential scanning calorimetry (DSC) is ubiquitous in pharmaceutical development, but the related technique of isothermal calorimetry (IC) is complementary and can be used to investigate a range of processes not amenable to analysis by DSC. Typically, IC is used for longer-term stability indicating or excipient compatibility assays because both the temperature and relative humidity (RH) in the sample ampoule can be controlled. However, instrument design and configuration, such as titration, gas perfusion or ampoule-breaking (solution) calorimetry, allow quantification of more specific values, such as binding enthalpies, heats of solution and quantification of amorphous content. As ever, instrument selection, experiment design and sample preparation are critical to ensuring the relevance of any data recorded. This article reviews the use of isothermal, titration, gas-perfusion and solution calorimetry in the context of pharmaceutical development, with a focus on instrument and experimental design factors, highlighted with examples from the recent literature. © 2011 Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

The problem on which this study focused was individuals' reduced capacity to respond to change and to engage in innovative learning when their reflective learning skills are limited. In this study, the preceding problem was addressed by two primary questions: To what degree can mastery of a strategy for reflective learning be facilitated as a part of an academic curriculum for professional practitioners? What impact will mastery of this strategy have on the learning style and adaptive flexibility of adult learners? The focus of the study was a direct application of human resource development technology in the professional preparation of teachers. The background of the problem in light of changing global paradigms and educational action orientations was outlined and a review of the literature was provided. Roots of thought for two key concepts (i.e., learning to learn from experience and meaningful reflection in learning) were traced. Reflective perspectives from the work of eight researchers were compared. A meta-model of learning from experience drawn from the literature served as a conceptual framework for the study. A strategy for reflective learning developed from this meta-model was taught to 109 teachers-in-training at Florida International University in Miami, Florida. Kolb's Adaptive Style Inventory and Learning Style Inventory were administered to the treatment group and to two control groups taught by the same professor. Three research questions and fourteen hypotheses guided data analysis. Qualitative review of 1565 personal documents generated by the treatment group indicated that 77 students demonstrated "double-loop" learning, going beyond previously established limits to perception, understanding, or action. The mean score for depth of reflection indicated "single-loop" learning with "reflection-in-action" present. The change in the mean score for depth of reflection from the beginning to end of the study was statistically significant (p < .05).
On quantitative measures of adaptive flexibility and learning style, with two exceptions, there were no significant differences noted between treatment and control groups on pre-test to post-test differences and on post-test mean scores adjusted for pre-test responses and demographic variables. Conclusions were drawn regarding treatment, instrumentation, and application of the strategy and the meta-model. Implications of the strategy and the meta-model for research, for education, for human resource development, for professional practice, and for personal growth were suggested. Qualitative training materials and Kolb's instruments were provided in the appendices.

Relevance:

100.00%

Publisher:

Abstract:

The increased occurrence of cyanobacteria (blue-green algae) blooms and the production of associated cyanotoxins present a threat to drinking water sources. Among the most common cyanotoxins found in potable water are microcystins (MCs), a family of cyclic heptapeptides. MCs are strongly hepatotoxic and known to initiate tumor-promoting activity. The presence of sub-lethal doses of MCs in drinking water is implicated as one of the key risk factors for an unusually high occurrence of primary liver cancer.

A variety of traditional water treatment methods have been attempted for the removal of cyanotoxins, but with limited success. Advanced Oxidation Technologies (AOTs) are attractive alternatives to traditional water treatments. We have demonstrated that ultrasonic irradiation and UV/H2O2 lead to the degradation of cyanotoxins in drinking water. These studies demonstrate that AOTs can effectively degrade MCs and that their associated toxicity is dramatically reduced. We have conducted detailed studies of the different degradation pathways of MCs and conclude that the hydroxyl radical is responsible for a significant fraction of the observed degradation. Results indicate that the preliminary products of the sonolysis of MCs are due to hydroxyl radical attack on the benzene ring and to substitution and cleavage of the diene of the Adda peptide residue. AOTs are attractive methods for the treatment of cyanotoxins in potable water supplies.

The photochemical transformation of MCs is important in their environmental degradation. Previous studies implicated singlet oxygen as a primary oxidant in the photochemical transformation of MCs. Our results indicate that singlet oxygen predominantly leads to degradation of phycocyanin, a pigment of blue-green algae, hence reducing the degradation of MCs. The predominant process involves isomerization of the diene (6E to 6Z) in the Adda side chain via photosensitized isomerization involving photoexcited phycocyanin. Our results indicate that photosensitized processes play a key role in the environmental fate and elimination of MCs in natural waters.

Relevance:

100.00%

Publisher:

Abstract:

Electrical energy is an essential resource for the modern world. Unfortunately, its price has almost doubled in the last decade. Furthermore, energy production is also currently one of the primary sources of pollution. These concerns are becoming more important in data-centers. As more computational power is required to serve hundreds of millions of users, bigger data-centers are becoming necessary. This results in higher electrical energy consumption. Of all the energy used in data-centers, including power distribution units, lights, and cooling, computer hardware consumes as much as 80%. Consequently, there is opportunity to make data-centers more energy efficient by designing systems with lower energy footprint. Consuming less energy is critical not only in data-centers. It is also important in mobile devices where battery-based energy is a scarce resource. Reducing the energy consumption of these devices will allow them to last longer and re-charge less frequently. Saving energy in computer systems is a challenging problem. Improving a system's energy efficiency usually comes at the cost of compromises in other areas such as performance or reliability. In the case of secondary storage, for example, spinning-down the disks to save energy can incur high latencies if they are accessed while in this state. The challenge is to be able to increase the energy efficiency while keeping the system as reliable and responsive as before. This thesis tackles the problem of improving energy efficiency in existing systems while reducing the impact on performance. 
First, we propose a new technique to achieve fine-grained energy proportionality in multi-disk systems. Second, we design and implement an energy-efficient cache system using flash memory that increases disk idleness to save energy. Finally, we identify and explore solutions for the page fetch-before-update problem in caching systems, which can (a) better control I/O traffic to secondary storage and (b) provide critical performance improvements for energy-efficient systems.
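The spin-down trade-off mentioned above can be made concrete with a standard break-even calculation (illustrative numbers, not measurements from the thesis): spinning down pays off only if the disk stays idle longer than the point at which the extra transition energy equals the idle energy saved:

```python
def breakeven_idle_seconds(p_idle_w: float, p_standby_w: float,
                           e_transition_j: float) -> float:
    """Idle time beyond which spinning the disk down saves energy.

    e_transition_j: total extra energy (J) spent spinning down and back up.
    """
    return e_transition_j / (p_idle_w - p_standby_w)

# Hypothetical drive: 8 W idle, 1 W standby, 70 J for the down/up transition.
print(breakeven_idle_seconds(8.0, 1.0, 70.0))  # 10.0 seconds
```

Idle periods shorter than the break-even time actually cost energy and add spin-up latency, which is why techniques that lengthen disk idleness (such as a flash cache) matter.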

Relevance:

100.00%

Publisher:

Abstract:

The primary purpose of these studies was to determine the effect of planning menus using the Institute of Medicine's (IOM's) Simple Nutrient Density Approach on nutrient intakes of long-term care (LTC) residents. In the first study, nutrient intakes of 72 subjects were assessed using Dietary Reference Intakes (DRIs) and IOM methodology. The intake distributions were used to set intake and menu-planning goals. In the second study, the facility's regular menus were modified to meet the intake goals for vitamin E, magnesium, zinc, vitamin D and calcium. An experiment was used to test whether the modified menu resulted in intakes of micronutrients sufficient to achieve a low prevalence (<3%) of nutrient inadequacies. Three-day weighed food intakes for 35 females were adjusted for day-to-day variations in order to obtain an estimate of long-term average intake and to estimate the proportion of residents with inadequate nutrient intakes.

In the first study, the prevalence of inadequate intakes was determined to be between 65% and 99% for magnesium, vitamin E, and zinc. Mean usual intakes of vitamin D and calcium were far below the Adequate Intakes (AIs). In the experimental study, the prevalence of inadequacies was reduced to <3% for zinc and vitamin E but not magnesium. The group's mean usual intake from the modified menu met or exceeded the AI for calcium but fell short for vitamin D. Alternatively, it was determined that the addition of a multivitamin and mineral (MVM) supplement to intakes of the regular menu could be used to achieve goals for vitamin E, zinc and vitamin D but not calcium and magnesium.

A combination of menu modification and MVM supplementation may be necessary to achieve a low prevalence of micronutrient inadequacies among LTC residents. Menus should be planned to optimize intakes of those nutrients that are low in an MVM, such as calcium, magnesium, and potassium. An MVM supplement should be provided to fill the gap for nutrients not provided in sufficient amounts by the diet, such as vitamin E and vitamin D.
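The prevalence-of-inadequacy figures above come from comparing the adjusted usual-intake distribution with the requirement. Under the EAR cut-point method, prevalence is approximately the fraction of the usual-intake distribution falling below the Estimated Average Requirement. A minimal sketch with a normal intake distribution and hypothetical numbers (not the study's data):

```python
from statistics import NormalDist

def prevalence_of_inadequacy(mean_intake: float, sd_intake: float,
                             ear: float) -> float:
    """EAR cut-point method: fraction of usual intakes below the requirement."""
    return NormalDist(mean_intake, sd_intake).cdf(ear)

# Hypothetical zinc intakes: mean 8.2 mg/day, SD 1.5 mg/day, EAR 6.8 mg/day.
print(prevalence_of_inadequacy(8.2, 1.5, 6.8))
```

Adjusting three-day intakes for day-to-day variation, as the study does, shrinks the SD of the distribution toward the between-person component, which is what makes this cut-point estimate meaningful.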

Relevance:

100.00%

Publisher:

Abstract:

Advanced Placement is a series of courses and tests designed to determine mastery of introductory college material. It has become part of the American educational system. The changing conception of AP was examined using critical theory to determine what led to a view of continual success. The study utilized David Armstrong's variation of Michel Foucault's critical theory to construct an analytical framework, drawing on Black and Ubbes' data-gathering techniques and Braun and Clark's approach to data analysis. Data included 1135 documents: 641 journal articles, 421 newspaper articles and 82 government documents.

The study revealed three historical ruptures, each correlated with a theme and subthemes. The first rupture was the Sputnik launch in 1958. Its correlated theme was AP leading to school reform, with subthemes of AP as reform for able students and AP's gaining of acceptance from secondary schools and higher education. The second rupture was the Nation at Risk report published in 1983. Its correlated theme was AP's shift in emphasis from the exam to the course, with subthemes of AP as a course, a shift in AP's target population, the use of AP courses to promote equity, and AP courses modifying curricula. The passage of the No Child Left Behind Act of 2001 was the third rupture. Its correlated theme was AP as a means to narrow the achievement gap, with subthemes of AP as a college-preparatory program and the shift of AP to an open-access program.

The themes revealed a perception that progressively integrated the program into American education. The AP program changed emphasis from tests to curriculum and is seen as the nation's premier academic program to promote reform and prepare students for college. It has become a major source of income for the College Board. In effect, AP has become an agent of privatization, spurring other private entities into competition for government funding. The change and growth of the program over the past 57 years resulted in a deep integration into American education. As such, the program remains an intrinsic part of the system and continues to evolve within American education.