900 results for Surfaces, Representation of.
Abstract:
BACKGROUND: The isolation of human monoclonal antibodies (mAbs) that neutralize a broad spectrum of primary HIV-1 isolates and the characterization of the human neutralizing antibody B cell response to HIV-1 infection are important goals that are central to the design of an effective antibody-based vaccine. METHODS AND FINDINGS: We immortalized IgG(+) memory B cells from individuals infected with diverse clades of HIV-1 and selected on the basis of plasma neutralization profiles that were cross-clade and relatively potent. Culture supernatants were screened using various recombinant forms of the envelope glycoproteins (Env) in multiple parallel assays. We isolated 58 mAbs that were mapped to different Env surfaces, most of which showed neutralizing activity. One mAb in particular (HJ16) specific for a novel epitope proximal to the CD4 binding site on gp120 selectively neutralized a multi-clade panel of Tier-2 HIV-1 pseudoviruses, and demonstrated reactivity that was comparable in breadth, but distinct in neutralization specificity, to that of the other CD4 binding site-specific neutralizing mAb b12. A second mAb (HGN194) bound a conserved epitope in the V3 crown and neutralized all Tier-1 and a proportion of Tier-2 pseudoviruses tested, irrespective of clade. A third mAb (HK20) with broad neutralizing activity, particularly as a Fab fragment, recognized a highly conserved epitope in the HR-1 region of gp41, but showed striking assay-dependent selectivity in its activity. CONCLUSIONS: This study reveals that by using appropriate screening methods, a large proportion of memory B cells can be isolated that produce mAbs with HIV-1 neutralizing activity. Three of these mAbs show unusual breadth of neutralization and therefore add to the current panel of HIV-1 neutralizing antibodies with potential for passive protection and template-based vaccine design.
Abstract:
BACKGROUND: With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. METHODS: Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. RESULTS: Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of clinical research coordinators (CRCs), transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. CONCLUSIONS: This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.
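The profile described above attaches simulation-oriented attributes, such as task duration and responsible role, to workflow activities. As a hedged illustration of how such stereotype data could feed discrete event simulation software, the following Python/SimPy sketch runs a toy trial visit; the activity names, durations, and roles are hypothetical and are not taken from the study.

```python
# Hypothetical sketch: running a clinical-trial workflow annotated with
# stereotype-like attributes (duration, role) through a discrete event simulation.
# Requires: pip install simpy
import simpy

# Each activity mirrors a UML Activity stereotyped with duration/role metadata.
WORKFLOW = [
    {"name": "patient check-in", "role": "admin_assistant", "duration": 10},
    {"name": "informed consent", "role": "crc",             "duration": 25},
    {"name": "vital signs",      "role": "nurse",           "duration": 15},
    {"name": "data entry (CRF)", "role": "crc",             "duration": 20},
]

def run_visit(env, staff):
    """Execute the workflow activities in sequence, waiting for the right role."""
    for act in WORKFLOW:
        with staff[act["role"]].request() as req:
            yield req                           # wait until the role is free
            yield env.timeout(act["duration"])  # perform the activity
            print(f"{env.now:5.1f} min  finished {act['name']}")

env = simpy.Environment()
staff = {role: simpy.Resource(env, capacity=1)
         for role in {"admin_assistant", "crc", "nurse"}}
env.process(run_visit(env, staff))
env.run()
```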
Abstract:
Phosphorus (P) is a crucial element for life and therefore for maintaining ecosystem productivity. Its local availability to the terrestrial biosphere results from the interaction between climate, tectonic uplift, atmospheric transport, and biotic cycling. Here we present a mathematical model that describes the terrestrial P-cycle in a simple but comprehensive way. The resulting dynamical system can be solved analytically for steady-state conditions, allowing us to test the sensitivity of the P-availability to the key parameters and processes. Given constant inputs, we find that humid ecosystems exhibit lower P availability due to higher runoff and losses, and that tectonic uplift is a fundamental constraint. In particular, we find that in humid ecosystems biotic cycling seems essential to maintain long-term P-availability. The time-dependent P dynamics for the Franz Josef and Hawaii chronosequences show how tectonic uplift is an important constraint on ecosystem productivity, while hydroclimatic conditions control the P-losses and speed towards steady-state. The model also helps describe how, with limited uplift and atmospheric input, as in the case of the Amazon Basin, ecosystems must rely on mechanisms that enhance P-availability and retention. Our novel model has a limited number of parameters and can be easily integrated into global climate models to provide a representation of the response of the terrestrial biosphere to global change.
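The abstract does not reproduce the model equations, so the following is only a hedged, single-pool illustration of the kind of steady-state analysis described: inputs from tectonic uplift and atmospheric deposition balanced against hydroclimate-driven losses, moderated by a biotic recycling fraction. All symbols and numbers are assumptions for illustration.

```python
# Illustrative single-pool P balance (not the authors' full model):
#   dP/dt = I_up + I_atm - k_loss * (1 - f_bio) * P
# where I_up is tectonic uplift input, I_atm atmospheric deposition,
# k_loss a hydroclimate-driven loss rate, and f_bio the fraction of P
# intercepted and recycled by the biota before it can be lost.

def steady_state_p(i_up, i_atm, k_loss, f_bio):
    """Analytical steady state: losses balance inputs."""
    return (i_up + i_atm) / (k_loss * (1.0 - f_bio))

# Hypothetical numbers: a humid site (high loss rate) vs. a drier site.
humid = steady_state_p(i_up=0.05, i_atm=0.02, k_loss=0.01, f_bio=0.9)
dry = steady_state_p(i_up=0.05, i_atm=0.02, k_loss=0.002, f_bio=0.9)
print(f"steady-state P, humid site: {humid:.1f}")
print(f"steady-state P, dry site:   {dry:.1f}")
```

With these invented values, the higher loss rate of the humid site lowers the steady-state P pool, while a recycling fraction approaching one raises it, mirroring the qualitative roles of runoff losses and biotic cycling described above.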
Abstract:
CONTEXT: In 1997, Congress authorized the US Food and Drug Administration (FDA) to grant 6-month extensions of marketing rights through the Pediatric Exclusivity Program if industry sponsors complete FDA-requested pediatric trials. The program has been praised for creating incentives for studies in children and has been criticized as a "windfall" to the innovator drug industry. This critique has been a substantial part of congressional debate on the program, which is due to expire in 2007. OBJECTIVE: To quantify the economic return to industry for completing pediatric exclusivity trials. DESIGN AND SETTING: A cohort study of programs conducted for pediatric exclusivity. Nine drugs that were granted pediatric exclusivity were selected. From the final study reports submitted to the FDA (2002-2004), key elements of the clinical trial design and study operations were obtained, and the cost of performing each study was estimated and converted into estimates of after-tax cash outflows. Three-year market sales were obtained and converted into estimates of after-tax cash inflows based on 6 months of additional market protection. Net economic return (cash inflows minus outflows) and net return-to-costs ratio (net economic return divided by cash outflows) for each product were then calculated. MAIN OUTCOME MEASURES: Net economic return and net return-to-cost ratio. RESULTS: The indications studied reflect a broad representation of the program: asthma, tumors, attention-deficit/hyperactivity disorder, hypertension, depression/generalized anxiety disorder, diabetes mellitus, gastroesophageal reflux, bacterial infection, and bone mineralization. The distribution of net economic return for 6 months of exclusivity varied substantially among products (net economic return ranged from -$8.9 million to $507.9 million and net return-to-cost ratio ranged from -0.68 to 73.63). CONCLUSIONS: The economic return for pediatric exclusivity is variable. As an incentive to complete much-needed clinical trials in children, pediatric exclusivity can generate lucrative returns or produce more modest returns on investment.
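The two outcome measures are simple arithmetic on the estimated cash flows. The sketch below shows the calculation with invented figures; the per-product cash flows themselves are not reported here beyond the ranges quoted above.

```python
# Net economic return = after-tax cash inflows (from 6 extra months of
# market protection) minus after-tax cash outflows (cost of the trials).
# Net return-to-cost ratio = net economic return / cash outflows.

def exclusivity_return(cash_inflows_musd, cash_outflows_musd):
    net_return = cash_inflows_musd - cash_outflows_musd
    return net_return, net_return / cash_outflows_musd

# Hypothetical product: $30M trial cost, $120M in added after-tax sales.
net, ratio = exclusivity_return(cash_inflows_musd=120.0, cash_outflows_musd=30.0)
print(f"net economic return: ${net:.1f}M, net return-to-cost ratio: {ratio:.2f}")
```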
Abstract:
A framework for adaptive and non-adaptive statistical compressive sensing is developed, where a statistical model replaces the standard sparsity model of classical compressive sensing. Within this framework we propose optimal task-specific sensing protocols jointly designed for classification and reconstruction. A two-step adaptive sensing paradigm is developed, where online sensing is applied to detect the signal class in the first step, followed by a reconstruction step adapted to the detected class and the observed samples. The approach is based on information theory, here tailored for Gaussian mixture models (GMMs), where an information-theoretic objective relating the sensed signals to a representation of the specific task of interest is maximized. Experimental results using synthetic signals, Landsat satellite attributes, and natural images of different sizes and with different noise levels show the improvements achieved using the proposed framework when compared to more standard sensing protocols. The underlying formulation can be applied beyond GMMs, at the price of higher mathematical and computational complexity.
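As a hedged illustration of the information-theoretic design idea, the sketch below evaluates the closed-form mutual information between a single Gaussian signal class and its noisy linear measurements, the simplest special case of the GMM setting; it is not the paper's algorithm, and the covariance and sensing matrices are invented.

```python
# Illustrative only: mutual information between a Gaussian signal x ~ N(0, Sigma)
# and linear noisy measurements y = Phi @ x + n, n ~ N(0, noise_var * I):
#   I(x; y) = 0.5 * logdet(I + Phi Sigma Phi^T / noise_var)
# A task-driven design would choose Phi (adaptively, per detected class) to
# maximize an objective of this kind; the GMM case combines terms like this one.
import numpy as np

def gaussian_mutual_information(phi, sigma, noise_var):
    m = phi.shape[0]
    gram = phi @ sigma @ phi.T / noise_var
    _, logdet = np.linalg.slogdet(np.eye(m) + gram)
    return 0.5 * logdet

rng = np.random.default_rng(0)
n, m = 16, 4                          # signal dimension, number of measurements
a = rng.standard_normal((n, n))
sigma = a @ a.T / n                   # hypothetical class covariance
phi_random = rng.standard_normal((m, n)) / np.sqrt(n)

# Compare a random projection with one aligned to the leading eigenvectors of Sigma.
eigvals, eigvecs = np.linalg.eigh(sigma)
phi_aligned = eigvecs[:, -m:].T       # rows = leading principal directions
for name, phi in [("random", phi_random), ("eigen-aligned", phi_aligned)]:
    print(name, gaussian_mutual_information(phi, sigma, noise_var=0.1))
```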
Abstract:
This study assesses the skill of advanced regional climate models (RCMs) in simulating southeastern United States (SE US) summer precipitation and explores the physical mechanisms responsible for the simulation skill at a process level. Analysis of the RCM output for the North American Regional Climate Change Assessment Program indicates that the RCM simulations of summer precipitation show the largest biases and a remarkable spread over the SE US compared to other regions in the contiguous US. The causes of such a spread are investigated by performing simulations using the Weather Research and Forecasting (WRF) model, a next-generation RCM developed by the US National Center for Atmospheric Research. The results show that the simulated biases in SE US summer precipitation are due mainly to the misrepresentation of the modeled North Atlantic subtropical high (NASH) western ridge. In the WRF simulations, the NASH western ridge shifts 7° northwestward when compared to that in the reanalysis ensemble, leading to a dry bias in the simulated summer precipitation according to the relationship between the NASH western ridge and summer precipitation over the southeast. Experiments utilizing the four-dimensional data assimilation technique further suggest that the improved representation of the circulation patterns (i.e., wind fields) associated with the NASH western ridge substantially reduces the bias in the simulated SE US summer precipitation. Our analysis of circulation dynamics indicates that the NASH western ridge in the WRF simulations is significantly influenced by the simulated planetary boundary layer (PBL) processes over the Gulf of Mexico. Specifically, a decrease (increase) in the simulated PBL height tends to stabilize (destabilize) the lower troposphere over the Gulf of Mexico, and thus inhibits (favors) the onset and/or development of convection. Such changes in tropical convection induce a tropical–extratropical teleconnection pattern, which modulates the circulation along the NASH western ridge in the WRF simulations and contributes to the modeled precipitation biases over the SE US. In conclusion, our study demonstrates that the NASH western ridge is an important factor responsible for the RCM skill in simulating SE US summer precipitation. Furthermore, the improvements in the PBL parameterizations for the Gulf of Mexico might help advance RCM skill in representing the NASH western ridge circulation and summer precipitation over the SE US.
Abstract:
Pianists of the twenty-first century have a wealth of repertoire at their fingertips. They busily study music from the different periods -- Baroque, Classical, Romantic, and some of the twentieth century -- trying to understand the culture and performance practice of the time and the stylistic traits of each composer so they can communicate their music effectively. Unfortunately, this leaves little time to notice the composers who are writing music today. Whether this neglect proceeds from lack of time or lack of curiosity, I feel we should be connected to music that was written in our own lifetime, when we already understand the culture and have knowledge of the different styles that preceded us. Therefore, in an attempt to promote today’s composers, I have selected piano music written during my lifetime, to show that contemporary music is effective and worthwhile and deserves as much attention as the music that preceded it. This dissertation showcases piano music composed from 1978 to 2005. A point of departure in selecting the pieces for this recording project is to represent the major genres in the piano repertoire in order to show a variety of styles, moods, lengths, and difficulties. The selected works therefore offer enough variety to program a complete contemporary recital and to meet the demands of pianists with different skill levels and recital programming needs. Since we live in an increasingly global society, music from all parts of the world is included to offer a fair representation of music being composed everywhere. Half of the music in this project comes from the United States. The other half comes from Australia, Japan, Russia, and Argentina. The composers represented in these recordings are: Lowell Liebermann, Richard Danielpour, Frederic Rzewski, Judith Lang Zaimont, Samuel Adler, Carl Vine, Nikolai Kapustin, Akira Miyoshi and Osvaldo Golijov. With the exception of one piano concerto, all the works are for solo piano. This recording project dissertation consists of two 60-minute CDs of selected repertoire, accompanied by a substantial document of in-depth program notes. The recordings are documented on compact discs that are housed within the University of Maryland Library System.
Abstract:
The spiking activity of nearby cortical neurons is correlated on both short and long time scales. Understanding this shared variability in firing patterns is critical for appreciating the representation of sensory stimuli in ensembles of neurons, the coincident influences of neurons on common targets, and the functional implications of microcircuitry. Our knowledge about neuronal correlations, however, derives largely from experiments that used different recording methods, analysis techniques, and cortical regions. Here we studied the structure of neuronal correlation in area V4 of alert macaques using recording and analysis procedures designed to match those used previously in primary visual cortex (V1), the major input to V4. We found that the spatial and temporal properties of correlations in V4 were remarkably similar to those of V1, with two notable differences: correlated variability in V4 was approximately one-third the magnitude of that in V1 and synchrony in V4 was less temporally precise than in V1. In both areas, spontaneous activity (measured during fixation while viewing a blank screen) was approximately twice as correlated as visual-evoked activity. The results provide a foundation for understanding how the structure of neuronal correlation differs among brain regions and stages in cortical processing and suggest that it is likely governed by features of neuronal circuits that are shared across the visual cortex.
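In this literature, "correlated variability" usually refers to the Pearson correlation of trial-to-trial spike counts between simultaneously recorded pairs. The sketch below illustrates that computation on simulated counts; it is not the study's data or exact analysis pipeline.

```python
# Illustrative computation of spike-count ("noise") correlation for a pair of
# neurons recorded over repeated presentations of the same stimulus.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200
shared = rng.normal(size=n_trials)      # shared fluctuation drives the correlation
counts_a = rng.poisson(lam=np.clip(10 + 2 * shared, 0, None))
counts_b = rng.poisson(lam=np.clip(12 + 2 * shared, 0, None))

# Pearson correlation of spike counts across trials (one stimulus condition).
r_sc = np.corrcoef(counts_a, counts_b)[0, 1]
print(f"spike-count correlation: {r_sc:.3f}")
```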
Abstract:
One way we keep track of our movements is by monitoring corollary discharges or internal copies of movement commands. This study tested a hypothesis that the pathway from superior colliculus (SC) to mediodorsal thalamus (MD) to frontal eye field (FEF) carries a corollary discharge about saccades made into the contralateral visual field. We inactivated the MD relay node with muscimol in monkeys and measured corollary discharge deficits using a double-step task: two sequential saccades were made to the locations of briefly flashed targets. To make second saccades correctly, monkeys had to internally monitor their first saccades; therefore deficits in the corollary discharge representation of first saccades should disrupt second saccades. We found, first, that monkeys seemed to misjudge the amplitudes of their first saccades; this was revealed by systematic shifts in second saccade end points. Thus corollary discharge accuracy was impaired. Second, monkeys were less able to detect trial-by-trial variations in their first saccades; this was revealed by reduced compensatory changes in second saccade angles. Thus corollary discharge precision also was impaired. Both deficits occurred only when first saccades went into the contralateral visual field. Single-saccade generation was unaffected. Additional deficits occurred in reaction time and overall performance, but these were bilateral. We conclude that the SC-MD-FEF pathway conveys a corollary discharge used for coordinating sequential saccades and possibly for stabilizing vision across saccades. This pathway is the first elucidated in what may be a multilevel chain of corollary discharge circuits extending from the extraocular motoneurons up into cerebral cortex.
Abstract:
In this paper we present different ways used by secondary school students to generalize when they try to solve problems involving sequences. A total of 359 Spanish students solved generalization problems in a written test. These problems were posed through particular terms expressed in different representations. We present examples that illustrate different ways of achieving various types of generalization and how students express generalization. We identify the graphical representation of generalization as a useful tool for eliciting other ways of expressing generalization, and we analyze its connection with those other ways of expressing it.
Abstract:
This paper describes the application of computational fluid dynamics (CFD) to simulate the macroscopic bulk motion of solder paste ahead of a moving squeegee blade in the stencil printing process during the manufacture of electronic components. The successful outcome of the stencil printing process is dependent on the interaction of numerous process parameters. A better understanding of these parameters is required to determine their relation to print quality and improve guidelines for process optimization. Various modelling techniques have arisen to analyse the flow behaviour of solder paste, including macroscopic studies of the whole mass of paste as well as microstructural analyses of the motion of individual solder particles suspended in the carrier fluid. This work builds on the knowledge gained to date from earlier analytical models and CFD investigations by considering the important non-Newtonian rheological properties of solder pastes which have been neglected in previous macroscopic studies. Pressure and velocity distributions are obtained from both Newtonian and non-Newtonian CFD simulations and evaluated against each other as well as existing established analytical models. Significant differences between the results are observed, which demonstrate the importance of modelling non-Newtonian properties for realistic representation of the flow behaviour of solder paste.
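The abstract does not name the constitutive law used, but solder pastes are commonly treated as shear-thinning fluids, for example with a Cross or power-law model. The sketch below shows a Cross-model apparent viscosity purely as an assumed illustration of how shear-rate-dependent viscosity enters such CFD simulations.

```python
# Illustrative Cross-model apparent viscosity (an assumed constitutive law,
# not necessarily the one used in the paper):
#   eta(gamma_dot) = eta_inf + (eta_0 - eta_inf) / (1 + (K * gamma_dot)**n)
# Shear-thinning behaviour (viscosity falling with shear rate) is what
# distinguishes the non-Newtonian simulation from the constant-viscosity
# Newtonian case.

def cross_viscosity(shear_rate, eta_0=200.0, eta_inf=1.0, K=1.0, n=0.7):
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (K * shear_rate) ** n)

for gamma_dot in (0.1, 1.0, 10.0, 100.0):   # 1/s, spanning squeegee-induced shear
    print(f"shear rate {gamma_dot:7.1f} 1/s -> apparent viscosity "
          f"{cross_viscosity(gamma_dot):8.2f} Pa.s")
```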
Abstract:
In this paper, we discuss the problem of maintenance of a CBR system for retrieval of rotationally symmetric shapes. The special feature of this system is that similarity is derived primarily from graph matching algorithms. The special problem of such a system is that it does not operate on search indices that may be derived from single cases and then used for visualisation and principal component analyses. Rather, the system is built on a similarity metric defined directly over pairs of cases. The problems of efficiency, consistency, redundancy, completeness and correctness are discussed for such a system. Performance measures for the CBR system are given, and the results for trials of the system are presented. The competence of the current case-base is discussed, with reference to a representation of cases as points in an n-dimensional feature space, and a Gramian visualisation. A refinement of the case base is performed as a result of the competence analysis and the performance of the case-base before and after refinement is compared.
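Because similarity is defined only over pairs of cases, visualising the case base requires embedding the pairwise similarity (Gram) matrix into a feature space. A standard route is a classical-MDS-style eigendecomposition; the sketch below is a generic illustration with invented similarity values, not the paper's specific Gramian visualisation.

```python
# Illustrative embedding of cases from a pairwise similarity (Gram) matrix,
# using a classical-MDS-style eigendecomposition. The similarity values here
# are made up; in the CBR system they would come from graph matching.
import numpy as np

sim = np.array([            # symmetric pairwise similarities between 4 cases
    [1.00, 0.80, 0.20, 0.10],
    [0.80, 1.00, 0.25, 0.15],
    [0.20, 0.25, 1.00, 0.90],
    [0.10, 0.15, 0.90, 1.00],
])

# Treat the similarity matrix as a Gram matrix and keep its leading components.
eigvals, eigvecs = np.linalg.eigh(sim)
order = np.argsort(eigvals)[::-1]
k = 2
coords = eigvecs[:, order[:k]] * np.sqrt(np.maximum(eigvals[order[:k]], 0.0))

print("2-D coordinates per case:\n", coords)   # nearby rows suggest redundant cases
```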
Abstract:
The aim of this paper is to develop a mathematical model with the ability to predict particle degradation during dilute phase pneumatic conveying. A numerical procedure, based on a matrix representation of degradation processes, is presented to determine the particle impact degradation propensity from a small number of particle single-impact tests carried out in a newly designed laboratory-scale degradation tester. A complete model of particle degradation during dilute phase pneumatic conveying is then described, where the calculation of degradation propensity is coupled with a flow model of the solids and gas phases in the pipeline. Numerical results are presented for degradation of granulated sugar in an industrial-scale pneumatic conveyor.
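A breakage matrix of the kind referred to above maps an inlet particle size distribution to an outlet one, with each column giving the fragment size distribution produced by particles of one size class under a given impact condition. The sketch below applies an invented three-class matrix; the numbers are illustrative, not the measured sugar data.

```python
# Illustrative use of a breakage matrix B: column j holds the mass fractions of
# fragments (by size class) produced when particles of size class j are impacted.
# Outlet distribution = B @ inlet distribution; columns sum to 1 (mass conserved).
import numpy as np

B = np.array([                 # 3 size classes: coarse, medium, fine
    [0.70, 0.00, 0.00],        # fraction remaining coarse
    [0.20, 0.85, 0.00],        # fraction broken into / remaining medium
    [0.10, 0.15, 1.00],        # fraction broken into / remaining fine
])
inlet = np.array([0.6, 0.3, 0.1])   # inlet mass fractions by size class

outlet = B @ inlet                  # mass fractions after one impact event
print("outlet size distribution:", outlet)
```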
Abstract:
In this paper, a Computational Fluid Dynamics framework is presented for the modelling of key processes which involve granular material (i.e. segregation, degradation, caking). Appropriate physical models and sophisticated algorithms have been developed for the correct representation of the different material components in a granular mixture. The various processes, which arise from the micromechanical properties of the different mixture species, can be obtained and parametrised in a DEM/experimental framework, thus enabling the continuum theory to correctly account for the micromechanical properties of a granular system. The present study establishes the link between the micromechanics and continuum theory and demonstrates the model capabilities in simulations of processes which are of great importance to the process engineering industry and involve granular materials in complex geometries.
Abstract:
A complete model of particle impact degradation during dilute-phase pneumatic conveying is developed, which combines a degradation model, based on the experimental determination of breakage matrices, and a physical model of solids and gas flow in the pipeline. The solids flow in a straight pipe element is represented by a model consisting of two zones: a strand-type flow zone immediately downstream of a bend, followed by a fully suspended flow region after dispersion of the strand. The breakage matrices constructed from data on 90° single-impact tests are shown to give a good representation of the degradation occurring in a 90° pipe bend. Numerical results are presented for degradation of granulated sugar in a large-scale pneumatic conveyor.