969 results for Aikin, Lucy, 1781-1864


Relevance:

10.00%

Publisher:

Abstract:

A major group of murine NK T (NKT) cells express an invariant Vα14Jα18 TCR α-chain specific for glycolipid Ags presented by CD1d. Murine Vα14Jα18+ NKT cells account for 30–50% of hepatic T cells and have potent antitumor activities. We have enumerated and characterized their human counterparts, Vα24Vβ11+ NKT cells, freshly isolated from histologically normal and tumor-bearing livers. In contrast to mice, human NKT cells are found in small numbers in healthy liver (0.5% of CD3+ cells) and blood (0.02%). In contrast to those in blood, most hepatic Vα24+ NKT cells express the Vβ11 chain. They include CD4+, CD8+, and CD4−CD8− cells, and many express the NK cell markers CD56, CD161, and/or CD69. Importantly, human hepatic Vα24+ T cells are potent producers of IFN-γ and TNF-α, but not IL-2 or IL-4, when stimulated pharmacologically or with the NKT cell ligand, α-galactosylceramide. Vα24+Vβ11+ cell numbers are reduced in tumor-bearing compared with healthy liver (0.1 vs 0.5%; p < 0.04). However, hepatic cells from cancer patients and healthy donors release similar amounts of IFN-γ in response to α-galactosylceramide. These data indicate that hepatic NKT cell repertoires are phenotypically and functionally distinct in humans and mice. Depletion of hepatic NKT cell subpopulations may underlie the susceptibility to metastatic liver disease.

Relevance:

10.00%

Publisher:

Abstract:

Specialist scholarly books, including monographs, allow researchers to present their work, pose questions, and test and extend areas of theory through long-form writing. Although research communities all over the world value monographs and depend heavily on them as a requirement of tenure and promotion in many disciplines, sales of this kind of book are in free fall, with some estimates suggesting declines of as much as 90% over twenty years (Willinsky 2006). Cash-strapped monograph publishers have found themselves caught in a negative cycle of increasing prices and falling sales, with few resources left to support experimentation, business-model innovation or engagement with digital technology and Open Access (OA). This chapter considers an important attempt to tackle failing markets for scholarly monographs and to enable the wider adoption of OA licenses for book-length works: the 2012–2014 Knowledge Unlatched pilot. Knowledge Unlatched is a bold attempt to reconfigure the market for specialist scholarly books: moving it beyond the sale of ‘content’ towards a model that supports the services valued by scholarly and wider communities in the context of digital possibility. Its success has powerful implications for the way we understand copyright’s role in the creative industries, and for the potential of established institutions and infrastructure to support the open and networked dynamics of a digital age.

Relevance:

10.00%

Publisher:

Abstract:

Networked digital technologies and Open Access (OA) are transforming the processes and institutions of research, knowledge creation and dissemination globally: enabling new forms of collaboration, allowing researchers to be seen and heard in new ways and reshaping relationships between stakeholders across the global academic publishing system. This article draws on Joseph Nye’s concept of ‘Soft Power’ to explore the role that OA is playing in helping to reshape academic publishing in China. It focusses on two important areas of OA development: OA journals and national-level repositories. OA is being supported at the highest levels, and there is potential for it to play an important role in increasing the status and impact of Chinese scholarship. Investments in OA also have the potential to help China to re-position itself within international copyright discourses: moving beyond criticism for failure to enforce the rights of foreign copyright owners and progressing an agenda that places greater emphasis on equality of access to the resources needed to foster innovation. However, the potential for OA to help China to build and project its soft power is being limited by the legacies of the print era, as well as the challenges of efficiently governing the national research and innovation systems.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to reduce the potential for litigation by improving valuers’ awareness of water risks. As part of a valuer’s due diligence, the paper provides guidance on how to identify such risks by explaining the different types and examining how online search tools can be used in conjunction with more traditional methods to evaluate the probability of these risks occurring.

Design/methodology/approach: The paper builds on prior research that examined the impact of water on valuations. By means of legal/doctrinal analysis, it considers the relevant issues from the perspective of managing client expectations and needs. In so doing, it identifies online tools available to assist in identifying at-risk properties and better informing clients.

Findings: While the internet provides a variety of tools for accessing relevant information, this information is most commonly provided subject to disclaimer. Valuers need to ensure that they do not rely blindly on these tools, but use them in conjunction with individual property inspections.

Research limitations/implications: Although the examples considered are primarily Australian, increasing water risks generally make the issues considered relevant for any jurisdiction. The research will be of particular interest to practitioners in coastal or riverine areas.

Practical implications: Valuation reports are sought for a variety of purposes by a variety of clients. These range from the experienced, knowledgeable developer looking to maximise available equity to the inexperienced, uneducated individual looking to acquire their home and thinking, more often than not, with their heart rather than their head. More informed practices by valuers will lead to valuation reports that are more easily understood by clients, thus lessening the likelihood of litigation against the valuer for negligence.

Originality/value: The paper highlights the issue of water risks; the need for valuers to properly address potential and actual risks in their reports; and the corresponding need to undertake all appropriate searches and enquiries of the property to be valued. It reinforces the importance of access to the internet as a tool in the valuation process.

Relevance:

10.00%

Publisher:

Abstract:

Volatility-hygroscopicity tandem differential mobility analyzer measurements were used to infer the composition of sub-100 nm diameter Southern Ocean marine aerosols at Cape Grim in November and December 2007. This study focuses on a short-lived high sea spray aerosol (SSA) event on 7–8 December with two externally mixed modes in the Hygroscopic Growth Factor (HGF) distributions (90% relative humidity (RH)), one at HGF > 2 and another at HGF ~ 1.5. The particles with HGF > 2 displayed a deliquescent transition at 73–75% RH and were nonvolatile up to 280°C, which identified them as SSA particles with a large inorganic sea-salt fraction. SSA HGFs were 3–13% below those for pure sea-salt particles, indicating an organic volume fraction (OVF) of up to 11–46%. The observed high inorganic fractions in sub-100 nm SSA are contrary to similar earlier studies. HGFs increased with decreasing particle diameter over the range 16–97 nm, suggesting a decreasing OVF, again contrary to earlier studies. SSA comprised up to 69% of the sub-100 nm particle number, corresponding to concentrations of 110–290 cm−3. Air mass back trajectories indicate that SSA particles were produced 1500 km, 20–40 h upwind of Cape Grim. Transmission electron microscopy (TEM) and X-ray spectrometry measurements of sub-100 nm aerosols collected from the same location, and at the same time, displayed a distinct lack of sea salt. Results herein highlight the potential for biases in TEM analysis of the chemical composition of marine aerosols.
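The step from "HGF a few percent below pure sea salt" to "OVF of tens of percent" can be illustrated with a volume-weighted (ZSR-style) mixing rule. This is a standard approach in the hygroscopicity literature, not necessarily the exact retrieval used in the study, and the component growth factors below are illustrative assumptions:

```python
# Minimal sketch: inferring an organic volume fraction (OVF) from a measured
# hygroscopic growth factor (HGF) via the ZSR volume-weighted mixing rule:
#   g_mix^3 = ovf * g_org^3 + (1 - ovf) * g_ss^3
# g_ss (pure sea salt) and g_org (near-hydrophobic organic) are assumed values.

def ovf_from_hgf(g_mix, g_ss=2.4, g_org=1.0):
    """Solve the ZSR relation above for the organic volume fraction."""
    return (g_ss**3 - g_mix**3) / (g_ss**3 - g_org**3)

# An HGF only a few percent below pure sea salt already implies a
# substantial organic volume fraction, because volumes go as g^3:
for g_mix in (2.33, 2.16):   # roughly 3% and 10% below g_ss = 2.4
    print(f"HGF {g_mix:.2f} -> OVF {ovf_from_hgf(g_mix):.2f}")
```

Because the mixing rule is cubic in the growth factors, small HGF suppressions map to disproportionately large organic volume fractions, consistent with the 3–13% HGF reduction corresponding to OVFs of 11–46%.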

Relevance:

10.00%

Publisher:

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the 3D Water Chemistry Atlas, an open source, Web-based system that enables three-dimensional (3D) sub-surface visualization of groundwater monitoring data, overlaid on the local geological model. Following a review of existing technologies, the system adopts Cesium (an open source Web-based 3D mapping and visualization interface) together with a PostgreSQL/PostGIS database for the technical architecture. In addition, a range of search, filtering, browse and analysis tools was developed that enables users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about activities such as coal seam gas extraction, waste water extraction and re-use.

Relevance:

10.00%

Publisher:

Abstract:

Dry sliding wear behavior of die-cast ADC12 aluminum alloy composites reinforced with short alumina fibers was investigated using a pin-on-disk wear tester. The Al2O3 fibers were 4 μm in diameter and were present in volume fractions (Vf) ranging from 0.03 to 0.26; fiber length varied from 40 to 200 μm. Disks of aluminum-alumina composites were rubbed against a pin of nitrided stainless steel SUS440B with a load of 10 N at a sliding velocity of 0.1 m/s. The unreinforced ADC12 aluminum alloy and composites containing low volume fractions of alumina (Vf ≈ 0.05) showed a sliding-distance-dependent transition from severe to mild wear. However, composites containing high volume fractions of alumina (Vf > 0.05) exhibited only mild wear for all sliding distances. The duration of the severe wear regime and the wear rate both decrease with increasing volume fraction. In the MMCs the wear rate in the mild wear regime decreases with increasing volume fraction, reaching a minimum value at Vf = 0.09; beyond Vf = 0.09 the wear rate increases marginally. On the other hand, the wear rate of the counterface (steel pin) was found to increase moderately with increasing Vf. From the analysis of wear data and detailed examination of (a) worn surfaces, (b) their cross-sections and (c) wear debris, two operative modes of wear mechanism have been identified in these materials: (i) adhesive wear in the case of the unreinforced matrix material and MMCs with low Vf, and (ii) abrasive wear in the case of MMCs with high Vf. (C) 2000 Elsevier Science Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

The development of high-quality tin monosulphide (SnS) layers is one of the crucial tasks in the fabrication of efficient SnS-based optoelectronic devices. Reduction of strain between the film and the substrate by using an appropriate lattice-matched (LM) substrate is a new approach for the growth of high-quality layers. In this view, SnS films were deposited on an LM Al substrate using the thermal evaporation technique with a low rate of evaporation. The as-grown SnS films were characterized using appropriate techniques and the obtained results are discussed by comparing them with the properties of SnS films grown on an amorphous substrate under the same conditions. From structural analysis of the films, it is noticed that the SnS films deposited on the amorphous substrate have crystallites oriented along different directions. However, the SnS crystallites grown on the Al substrate exhibited epitaxial growth along the [101] direction. Photoluminescence (PL) and Raman studies reveal that the films grown on the Al substrate have better optical properties than the films grown on amorphous substrates. (C) 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

Relevance:

10.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing today, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method whose data-collection costs were acceptable, and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
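The finite-population confidence intervals attributed here to Neyman's framework can be illustrated with a minimal sketch (not from the thesis): the mean of a simple random sample drawn without replacement, with the finite population correction shrinking the standard error as the sample approaches the population, and a normal-approximation interval justified by the Central Limit Theorem. The data and population size below are invented for illustration:

```python
import math

# Sketch of a Neyman-style 95% confidence interval for a finite population
# mean under simple random sampling without replacement. The z = 1.96
# multiplier assumes approximate normality of the estimator (CLT).

def srs_mean_ci(sample, N, z=1.96):
    n = len(sample)
    mean = sum(sample) / n
    s2 = sum((x - mean) ** 2 for x in sample) / (n - 1)   # sample variance
    fpc = 1.0 - n / N                   # finite population correction
    se = math.sqrt(fpc * s2 / n)        # standard error of the mean
    return mean, (mean - z * se, mean + z * se)

# Hypothetical sample of n = 8 from a population of N = 1000:
mean, (lo, hi) = srs_mean_ci([4, 7, 5, 6, 8, 5, 6, 7], N=1000)
print(f"estimate {mean:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

As n approaches N the fpc term drives the interval width to zero, which is exactly the finite-population behavior that distinguishes this setting from Fisher's infinite-population model.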

Relevance:

10.00%

Publisher:

Abstract:

In this study, we investigated nonlinear measures of chaos of QT interval time series in 28 normal control subjects, 36 patients with panic disorder and 18 patients with major depression in supine and standing postures. We obtained the minimum embedding dimension (MED) and the largest Lyapunov exponent (LLE) of instantaneous heart rate (HR) and QT interval series. MED quantifies the system's complexity and LLE its predictability. There was a significantly lower MED and a significantly increased LLE of QT interval time series in patients. Most importantly, nonlinear indices of QT/HR time series, MEDqthr (MED of QT/HR) and LLEqthr (LLE of QT/HR), were highly significantly different between controls and both patient groups in either posture. Results remained the same even after adjusting for age. The increased LLE of QT interval time series in patients with anxiety and depression is in line with our previous findings of higher QTvi (QT variability index, a log ratio of QT variability corrected for mean QT squared divided by heart rate variability corrected for mean heart rate squared) in these patients, using linear techniques. Increased LLEqthr (LLE of QT/HR) may be a more sensitive tool to study cardiac repolarization and a valuable addition to time domain measures such as QTvi. This is especially important in light of the finding that LLEqthr correlated poorly and nonsignificantly with QTvi. These findings suggest an increase in relative cardiac sympathetic activity and a decrease in certain aspects of cardiac vagal function in patients with anxiety as well as depression. The lack of correlation between QTvi and LLEqthr suggests that this nonlinear index is a valuable addition to the linear measures. These findings may also help to explain the higher incidence of cardiovascular mortality in patients with anxiety and depressive disorders. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
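The two quantities at the core of the analysis, a time-delay embedding and the largest Lyapunov exponent, can be sketched as follows. This is a simplified Rosenstein-style estimator, not the authors' implementation; the embedding dimension, delay, Theiler window and fit horizon are illustrative choices, and the chaotic logistic map stands in for a physiological series:

```python
import numpy as np

def embed(x, dim, tau=1):
    """Time-delay embedding: row i is [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def lle_rosenstein(x, dim=3, tau=1, theiler=10, horizon=8):
    """Rough largest-Lyapunov-exponent estimate: slope of the mean log
    divergence of initially nearest neighbors versus time step."""
    Y = embed(np.asarray(x, float), dim, tau)
    n = len(Y)
    d = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=2)
    for i in range(n):                      # Theiler window: exclude
        d[i, max(0, i - theiler):min(n, i + theiler + 1)] = np.inf  # temporal neighbors
    nn = np.argmin(d, axis=1)               # nearest neighbor of each point
    div = []
    for k in range(1, horizon):
        logs = [np.log(np.linalg.norm(Y[i + k] - Y[nn[i] + k]))
                for i in range(n)
                if i + k < n and nn[i] + k < n
                and np.linalg.norm(Y[i + k] - Y[nn[i] + k]) > 0]
        div.append(np.mean(logs))
    slope, _ = np.polyfit(np.arange(1, horizon), div, 1)
    return slope

# The chaotic logistic map (r = 4) should give a clearly positive estimate:
x, xs = 0.4, []
for _ in range(600):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
print(f"LLE estimate: {lle_rosenstein(xs):.2f}")
```

A positive slope indicates exponential divergence of nearby states, i.e. reduced predictability, which is the sense in which the abstract interprets an increased LLE of the QT/HR series.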

Relevance:

10.00%

Publisher:

Abstract:

This brief presents the capturability analysis of a 3-D Retro-proportional navigation (Retro-PN) guidance law, which uses a negative navigation constant (as against the usual positive one) for intercepting targets having higher speeds than the interceptor. This modification makes it possible to achieve collision conditions that were inaccessible to the standard PN law. A modified polar coordinate system, which makes the model more compact, is used in this brief for the capturability analysis. In addition to the ratio of target to interceptor speeds, the direction cosines of the interceptor and target velocity vectors play a crucial role in capturability. The existence of a nontrivial capture zone of the Retro-PN guidance law, and necessary and sufficient conditions for capturing the target in finite time, are presented. A sufficient condition on the navigation constant is derived to ensure finiteness of the line-of-sight turn rate. The results are more extensive than those available for 2-D engagements, which can be obtained as special cases of this brief. Simulation results are given to support the analytical results.
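Since the abstract notes that 2-D engagements arise as special cases, the guidance command itself can be sketched in the plane. Standard PN commands lateral acceleration a = N·Vc·λ̇ with navigation constant N > 0; Retro-PN simply takes N < 0, reversing the command. The geometry and numbers below are illustrative, not from the brief:

```python
import math

def los_rate(rx, ry, vx, vy):
    """Line-of-sight turn rate lambda_dot from relative position/velocity:
    the cross product r x v divided by |r|^2."""
    return (rx * vy - ry * vx) / (rx * rx + ry * ry)

def closing_speed(rx, ry, vx, vy):
    """Vc = -r_dot: positive when range is decreasing."""
    r = math.hypot(rx, ry)
    return -(rx * vx + ry * vy) / r

def pn_accel(N, rx, ry, vx, vy):
    """Commanded lateral acceleration a = N * Vc * lambda_dot.
    N > 0 gives standard PN; N < 0 gives the Retro-PN command."""
    return N * closing_speed(rx, ry, vx, vy) * los_rate(rx, ry, vx, vy)

# Target 1 km ahead, net closing at 100 m/s with some LOS drift:
rx, ry, vx, vy = 1000.0, 0.0, -100.0, 50.0
print(pn_accel(+3, rx, ry, vx, vy))   # standard PN command
print(pn_accel(-3, rx, ry, vx, vy))   # Retro-PN: sign-reversed command
```

With N negative the interceptor turns away from the rotating line of sight instead of into it, which is what opens up the "retro" collision geometries against faster targets that standard PN cannot reach.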

Relevance:

10.00%

Publisher:

Abstract:

The degree to which the lithosphere and mantle are coupled and contribute to surface deformation beneath continental regions remains a fundamental question in the field of geodynamics. Here we use a new approach with a surface deformation field constrained by GPS, geologic, and seismicity data, together with a lithospheric geodynamic model, to solve for tractions inferred to be generated by mantle convection that (1) drive extension within interior Alaska generating southward directed surface motions toward the southern convergent plate boundary, (2) result in accommodation of the relative motions between the Pacific and North America in a comparatively small zone near the plate boundary, and (3) generate the observed convergence within the North American plate interior in the Mackenzie mountains in northwestern Canada. The evidence for deeper mantle influence on surface deformation beneath a continental region suggests that this mechanism may be an important contributing driver to continental plate assemblage and breakup.

Relevance:

10.00%

Publisher:

Abstract:

The information-theoretic approach to security entails harnessing the correlated randomness available in nature to establish security. It uses tools from information theory and coding and yields provable security, even against an adversary with unbounded computational power. However, the feasibility of this approach in practice depends on the development of efficiently implementable schemes. In this paper, we review a special class of practical schemes for information-theoretic security that are based on 2-universal hash families. Specific cases of secret key agreement and wiretap coding are considered, and general themes are identified. The scheme presented for wiretap coding is modular and can be implemented easily by including an extra preprocessing layer over the existing transmission codes.
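A concrete 2-universal family makes the privacy-amplification idea tangible. The classic Carter-Wegman family h(x) = ((a·x + b) mod p) mod m, with p prime and a ≠ 0, is one standard example (the paper's schemes are not necessarily built on this particular family): for any fixed x ≠ y the collision probability over the random draw of (a, b) is at most about 1/m, so hashing a partially secret string through a publicly chosen member distills a shorter, nearly uniform key:

```python
import random

# Sketch of privacy amplification with the 2-universal family
#   h_{a,b}(x) = ((a*x + b) mod p) mod m,   p prime, 1 <= a < p.
# Inputs x must lie in [0, p) for the universality guarantee to hold.

P = (1 << 61) - 1          # a Mersenne prime larger than any input used here

def draw_hash(m, rng):
    """Publicly sample one member of the family (a and b can be public;
    only the input string needs to be partially secret)."""
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

rng = random.Random(7)
m = 1 << 16                # distill a 16-bit key
h = draw_hash(m, rng)

shared_secret = 0xDEADBEEFCAFE      # string shared by Alice and Bob
alice_key = h(shared_secret)
bob_key = h(shared_secret)
assert alice_key == bob_key         # same public hash, same input, same key

# Empirical look at the universality bound Pr[h(x) = h(y)] <~ 1/m for one
# fixed pair x != y, over many independent draws of (a, b):
x, y, collisions, trials = 12345, 67890, 0, 20000
for _ in range(trials):
    g = draw_hash(m, rng)
    collisions += (g(x) == g(y))
print(collisions, "collisions in", trials, "trials; bound ~", 1 / m)
```

The same primitive serves both applications named in the abstract: in secret key agreement it compresses away the adversary's partial information, and in the modular wiretap construction it is exactly the extra preprocessing layer applied before the ordinary transmission code.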

Relevance:

10.00%

Publisher:

Abstract:

Combining recently published microindentation experiments on metallic materials using spherical indenters with Johnson's correspondence principle, this paper discusses, within conventional elasto-plastic theory, the form of the hardness solution for a conical indenter when the influence of the indenter tip radius of curvature is taken into account. It concludes that the tip radius of the indenter is not the source of the indentation size effect; on the contrary, it weakens the size effect.