984 results for cutting tools


Relevance: 20.00%

Abstract:

BACKGROUND: HIV-1 clade C (HIV-C) predominates worldwide, and anti-HIV-C vaccines are urgently needed. Neutralizing antibody (nAb) responses are considered important but have proved difficult to elicit. Although some current immunogens elicit antibodies that neutralize highly neutralization-sensitive (tier 1) HIV strains, most circulating HIVs exhibiting a less sensitive (tier 2) phenotype are not neutralized. Thus, both tier 1 and 2 viruses are needed for vaccine discovery in nonhuman primate models. METHODOLOGY/PRINCIPAL FINDINGS: We constructed a tier 1 simian-human immunodeficiency virus, SHIV-1157ipEL, by inserting an "early," recently transmitted HIV-C env into the SHIV-1157ipd3N4 backbone [1] encoding a "late" form of the same env, which had evolved in a SHIV-infected rhesus monkey (RM) with AIDS. SHIV-1157ipEL was rapidly passaged to yield SHIV-1157ipEL-p, which remained exclusively R5-tropic and had a tier 1 phenotype, in contrast to "late" SHIV-1157ipd3N4 (tier 2). After 5 weekly low-dose intrarectal exposures, SHIV-1157ipEL-p systemically infected 16 out of 17 RM with high peak viral RNA loads and depleted gut CD4+ T cells. SHIV-1157ipEL-p and SHIV-1157ipd3N4 env genes diverge mostly in V1/V2. Molecular modeling revealed a possible mechanism for the increased neutralization resistance of SHIV-1157ipd3N4 Env: V2 loops hindering access to the CD4 binding site, shown experimentally with nAb b12. Similar mutations have been linked to decreased neutralization sensitivity in HIV-C strains isolated from humans over time, indicating parallel HIV-C Env evolution in humans and RM. CONCLUSIONS/SIGNIFICANCE: SHIV-1157ipEL-p, the first tier 1 R5 clade C SHIV, and SHIV-1157ipd3N4, its tier 2 counterpart, represent biologically relevant tools for anti-HIV-C vaccine development in primates.


Most of the powerful analytical tools used in the social sciences are well suited for studying static situations. Static and mechanistic analysis, however, is not adequate for understanding the changing world in which we live. In order to adequately address the most pressing social and environmental challenges looming ahead, we need to develop analytical tools for analyzing dynamic situations, particularly institutional change. In this paper, we develop an analytical tool to study institutional change, more specifically, the evolution of rules and norms. We believe that for such an analytical tool to be useful in developing a general theory of institutional change, it needs to enable the analyst to concisely record the processes of change in multiple specific settings, so that lessons from those settings can eventually be integrated into a more general predictive theory of change. Copyright © The JOIE Foundation 2010.


Duke Medicine used interprofessional case conferences (ICCs) from 2008-2012 with the objective of modeling and facilitating the development of teamwork skills among students from diverse health professions, including physical therapy, physician assistant, medical doctor, and nursing. The purpose of this publication was to describe the operational process used to develop and implement the ICCs and to measure their success in order to shape future work. The ICCs were offered to develop the skills and attitudes essential for participation in healthcare teams. Facilitated by faculty from different professions, students conducted a comprehensive historical assessment of a standardized patient (SP), determined pertinent physical and laboratory assessments to undertake, and developed and shared a comprehensive management plan. Cases included patient problems that were authentic and relevant to each professional student in attendance. The main barriers to implementation are outlined, and the focus on the process of working together is highlighted. Evaluation showed high satisfaction rates among participants, and the outcomes from these experiences are presented. The limitations of these results are discussed, and recommendations for future assessment are emphasized. The ICCs demonstrated that students will come together voluntarily to learn in teams, even at a research-focused institution, and will report benefit from the collaborative exercise.


Context: Stress fractures are among the most common injuries in sports, accounting for approximately 10% of all overuse injuries. Treatment of fifth metatarsal stress fractures involves both surgical and nonsurgical interventions. Fifth metatarsal stress fractures are difficult to treat because of the risks of delayed union, nonunion, and recurrent injury. Most of these injuries occur during agility tasks, such as those performed in soccer, basketball, and lacrosse. Objective: To examine the effect of a rigid carbon graphite footplate on plantar loading during 2 agility tasks. Design: Crossover study. Setting: Laboratory. Patients or Other Participants: A total of 19 recreational male athletes with no history of lower extremity injury in the past 6 months and no previous metatarsal stress fractures were tested. Main Outcome Measure(s): Seven 45° side-cut and crossover-cut tasks were completed in a shoe with or without a full-length rigid carbon plate. Testing order between the shoe conditions and the 2 cutting tasks was randomized. Plantar-loading data were recorded using instrumented insoles. Peak pressure, maximum force, force-time integral, and contact area beneath the total foot, the medial and lateral midfoot, and the medial, middle, and lateral forefoot were analyzed. A series of paired t tests was used to examine differences between the footwear conditions (carbon graphite footplate, shod) for both cutting tasks independently (α = .05). Results: During the side-cut task, the footplate increased total foot and lateral midfoot peak pressures while decreasing contact area and lateral midfoot force-time integral. During the crossover-cut task, the footplate increased total foot and lateral midfoot peak pressure and lateral forefoot force-time integral while decreasing total and lateral forefoot contact area.
Conclusions: Although a rigid carbon graphite footplate altered some aspects of the plantar-pressure profile during cutting in uninjured participants, it was ineffective in reducing plantar loading beneath the fifth metatarsal.
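The statistical comparison described above (paired differences between the two footwear conditions within each participant) can be sketched in a few lines of Python. The numbers below are illustrative, not the study's data, and p-values would normally come from a t-distribution table or a statistics library:

```python
import math

def paired_t(a, b):
    """Paired t-test statistic for two matched samples (pure Python).
    Returns (t, degrees of freedom)."""
    assert len(a) == len(b)
    d = [x - y for x, y in zip(a, b)]        # within-participant differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Illustrative (invented) peak-pressure values, kPa, one pair per participant:
footplate = [412.0, 398.5, 450.2, 433.1, 421.7]
shod      = [388.3, 379.9, 441.0, 415.6, 402.2]
t, df = paired_t(footplate, shod)            # t > 0: footplate pressures higher
```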


Asymmetries in sagittal plane knee kinetics have been identified as a risk factor for anterior cruciate ligament (ACL) re-injury. Clinical tools are needed to identify these asymmetries. This study examined the relationships between knee kinetic asymmetries and ground reaction force (GRF) asymmetries during athletic tasks in adolescent patients following ACL reconstruction (ACL-R). Kinematic and GRF data were collected during a stop-jump task and a side-cutting task for 23 patients. Asymmetry indices between the surgical and non-surgical limbs were calculated for GRF and knee kinetic variables. For the stop-jump task, knee kinetic asymmetry indices were correlated with all GRF asymmetry indices (P < 0.05), except for loading rate. The vertical GRF impulse asymmetry index predicted the peak knee moment, average knee moment, and knee work (R² ≥ 0.78, P < 0.01) asymmetry indices. For the side-cutting task, knee kinetic asymmetry indices were correlated with the peak propulsion vertical GRF and vertical GRF impulse asymmetry indices (P < 0.05). The vertical GRF impulse asymmetry index predicted the peak knee moment, average knee moment, and knee work (R² ≥ 0.55, P < 0.01) asymmetry indices. Vertical GRF asymmetries may therefore be a viable surrogate for knee kinetic asymmetries and may assist in optimizing rehabilitation outcomes and minimizing re-injury rates.
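As a rough illustration of the kind of limb-symmetry measure involved, one common asymmetry-index definition (assumed for this sketch; the abstract does not state the paper's exact formula) expresses the between-limb difference as a percentage of the limb mean:

```python
def asymmetry_index(surgical, non_surgical):
    """One common asymmetry-index definition (an assumption here, not
    necessarily the paper's): percent difference relative to the limb mean.
    Negative values mean the surgical limb loads less than the non-surgical."""
    return 100.0 * (surgical - non_surgical) / ((surgical + non_surgical) / 2.0)

# Illustrative (invented) vertical GRF impulses, N*s, for the two limbs:
ai = asymmetry_index(210.0, 250.0)   # surgical limb underloaded -> negative AI
```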


The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large-population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children to aid in risk detection and research on neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms that measure responses to general ASD risk assessment tasks and activities outlined by the AOSI, which assess visual attention, by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, demonstrating that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments.


The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral markers can be observed late in the first year of life. Many of these studies involved extensive frame-by-frame video observation and analysis of a child's natural behavior. Although non-intrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are impractical for clinical and large-population research purposes. Diagnostic measures for ASD are available for infants but are accurate only when used by specialists experienced in early diagnosis. This work is a first milestone in a long-term multidisciplinary project that aims to help clinicians and general practitioners accomplish this early detection/measurement task automatically. We focus on providing computer vision tools to measure and identify ASD behavioral markers based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure three critical AOSI activities that assess visual attention. We augment these AOSI activities with an additional test that analyzes asymmetrical patterns in unsupported gait. The first set of algorithms assesses head motion by tracking facial features, while the gait analysis relies on joint foreground segmentation and 2D body pose estimation in video. We show results that provide insight to augment the clinician's behavioral observations obtained from real in-clinic assessments.
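One visual-attention quantity of the kind the AOSI activities probe, latency to orient toward a stimulus, can be approximated from a per-frame head-yaw estimate. The function below is a hypothetical simplification (the threshold, frame rate, and yaw values are invented for illustration; it is not the authors' algorithm):

```python
def attention_latency(yaw_deg, fps, stimulus_frame, threshold_deg=15.0):
    """Seconds from stimulus onset until the head orients toward it,
    i.e. until |yaw| falls below threshold_deg. Returns None if the
    child never orients. Hypothetical measure for illustration only."""
    for i in range(stimulus_frame, len(yaw_deg)):
        if abs(yaw_deg[i]) < threshold_deg:
            return (i - stimulus_frame) / fps
    return None

# Illustrative per-frame yaw estimates (degrees) at 2 frames per second;
# a stimulus appears at frame 1 and the head turns toward it over time:
yaw = [40, 38, 35, 30, 22, 14, 9, 5]
lat = attention_latency(yaw, fps=2, stimulus_frame=1)
```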


To maintain a strict balance between demand and supply in US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. This model determines, for each time period and power plant, the startup and shutdown times, the amount of power produced, and the provisioning of spinning and non-spinning power generation reserves. This deterministic optimization model takes as input the characteristics of all generating units, such as their installed power generation capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. The reserve requirement is determined based on the likelihood of outages on the supply side and on the levels of forecast error in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic market clearing models. Rather than using a fixed reserve target as an input, stochastic market clearing models take different scenarios of wind power into consideration and determine the reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, together with wind scenarios generated from BPA (Bonneville Power Administration) data, this paper compares the performance of stochastic and deterministic market clearing models.
The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and the difficulty of setting the models' parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, value of lost load (VOLL), and wind spillage costs have on the comparison of stochastic vs. deterministic market clearing models.
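To make the deterministic side concrete, a toy single-period version of the clearing problem (meet demand at least cost while holding a fixed reserve target of spare committed capacity) can be sketched as follows. The generator data, the merit-order heuristic, and the single period are all illustrative simplifications; real ISO models are multi-period mixed-integer programs with ramp, minimum up/down-time, and network constraints:

```python
# Toy single-period "market clearing" under a fixed reserve requirement.
generators = [                       # (name, capacity MW, marginal cost $/MWh)
    ("coal",   400.0, 22.0),
    ("ccgt",   300.0, 35.0),
    ("peaker", 150.0, 80.0),
]
demand, reserve_req = 550.0, 100.0   # deterministic model: fixed reserve target

committed, headroom, cost, remaining = [], 0.0, 0.0, demand
for name, cap, mc in sorted(generators, key=lambda g: g[2]):  # merit order
    if remaining <= 0 and headroom >= reserve_req:
        break                        # demand met and reserve target satisfied
    committed.append(name)
    dispatch = min(cap, max(remaining, 0.0))
    remaining -= dispatch
    headroom += cap - dispatch       # spare committed capacity counts as reserve
    cost += dispatch * mc
```

With these numbers, coal runs at full output, the CCGT covers the rest and provides the reserve headroom, and the expensive peaker is never committed. A stochastic variant would instead commit against many wind scenarios and let the reserve emerge as an output.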


Stimulated CD4(+) T lymphocytes can differentiate into effector T cell (Teff) or inducible regulatory T cell (Treg) subsets with specific immunological roles. We show that Teff and Treg require distinct metabolic programs to support these functions. Th1, Th2, and Th17 cells expressed high surface levels of the glucose transporter Glut1 and were highly glycolytic. Treg, in contrast, expressed low levels of Glut1 and had high lipid oxidation rates. Consistent with glycolysis and lipid oxidation promoting Teff and Treg, respectively, Teff were selectively increased in Glut1 transgenic mice and reliant on glucose metabolism, whereas Treg had activated AMP-activated protein kinase and were dependent on lipid oxidation. Importantly, AMP-activated protein kinase stimulation was sufficient to decrease Glut1 and increase Treg generation in an asthma model. These data demonstrate that CD4(+) T cell subsets require distinct metabolic programs that can be manipulated in vivo to control Treg and Teff development in inflammatory diseases.


The hepatitis delta virus (HDV) ribozyme is a self-cleaving RNA enzyme essential for processing viral transcripts during rolling circle viral replication. The first crystal structure of the cleaved ribozyme was solved in 1998, followed by structures of uncleaved, mutant-inhibited and ion-complexed forms. Recently, methods have been developed that make the task of modeling RNA structure and dynamics significantly easier and more reliable. We have used ERRASER and PHENIX to rebuild and re-refine the cleaved and cis-acting C75U-inhibited structures of the HDV ribozyme. The results correct local conformations and identify alternates for RNA residues, many in functionally important regions, leading to improved R values and model validation statistics for both structures. We compare the rebuilt structures to a higher resolution, trans-acting deoxy-inhibited structure of the ribozyme, and conclude that although both inhibited structures are consistent with the currently accepted hammerhead-like mechanism of cleavage, they do not add direct structural evidence to the biochemical and modeling data. However, the rebuilt structures (PDBs: 4PR6, 4PRF) provide a more robust starting point for research on the dynamics and catalytic mechanism of the HDV ribozyme and demonstrate the power of new techniques to make significant improvements in RNA structures that impact biologically relevant conclusions.


Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation, and communication call insertion. In this, the first of a series of papers [1–3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
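The core question a dependence analysis answers is whether one loop iteration reads a value written by another, since such loop-carried dependences block naive parallelisation. A drastically simplified, hypothetical illustration for constant array-subscript offsets (real analyses, including CAPTools', handle symbolic subscripts, interprocedural effects, and multi-dimensional arrays) is:

```python
def loop_carried(write_offset, read_offset):
    """True if a write to A(i + write_offset) and a read of
    A(i + read_offset) in the same loop body touch the same element
    in *different* iterations, i.e. the dependence is loop-carried.
    Toy model: constant offsets only."""
    return (read_offset - write_offset) != 0

# DO i = 2, n : A(i) = A(i-1) + B(i)   -> carried dependence, loop is serial
# DO i = 1, n : A(i) = B(i) * 2        -> no carried dependence, parallelisable
```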


User-supplied knowledge and interaction are vital components of a toolkit for producing high quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components that such a parallelisation toolkit should possess to provide an effective environment in which to identify, extract, and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular, we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction, and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused only on what is needed. User control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with users in terms they will understand. These features, amongst others, and their use in the parallelisation process are described, and their effectiveness is discussed.


The consecutive, partly overlapping emergence of expert systems and then of neural computation methods among intelligent technologies is reflected in the evolving scene of their application to nuclear engineering. This paper provides a bird's-eye view of the state of such applications in the domain, along with a review of one particular task, perhaps the most economically important: refueling design in nuclear power reactors.


Temperature distributions involved in some metal-cutting or surface-milling processes may be obtained by solving a non-linear inverse problem. A two-level concept of parallelism is introduced to compute such temperature distributions. The primary level is based on a problem-partitioning concept driven by the nature and properties of the non-linear inverse problem; such partitioning results in a coarse-grained parallel algorithm. A simplified 2-D metal-cutting process is used as an example to illustrate the concept. A secondary level exploiting further parallel properties, based on the concept of domain-data parallelism, is explained and implemented using MPI. Experiments were performed on a network of loosely coupled machines consisting of SUN Sparc Classic workstations and on a network of tightly coupled processors, namely the Origin 2000.
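The secondary, domain-data level of parallelism rests on splitting the computational grid into contiguous subdomains, one per process. A minimal sketch of such a partition (the paper's implementation distributes the work with MPI; the grid size and process count here are illustrative):

```python
def partition(n_cells, n_procs):
    """Split n_cells into n_procs near-equal contiguous (start, end) ranges,
    one subdomain per process. In an MPI code each process would then work
    on its own range and exchange boundary (halo) values with neighbours."""
    base, extra = divmod(n_cells, n_procs)
    bounds, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # spread the remainder evenly
        bounds.append((start, start + size))
        start += size
    return bounds

subdomains = partition(10, 3)   # a 10-cell grid across 3 processes
```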


The Production Workstation developed at the University of Greenwich is evaluated as a tool for assisting all those concerned with production. It enables the producer, director, and cinematographer to explore the quality of the images obtainable with a wide range of tools. Users are free to explore many possible choices, ranging from 35mm to DV, and to combine them with the many image manipulation tools of the cinematographer. The validation required for the system, concerning the accuracy of the resulting imagery, is explicitly examined. Copyright © 1999 by the Society of Motion Picture and Television Engineers, Inc.