923 results for complex place-based initiatives
Fast Structure-Based Assignment of 15N HSQC Spectra of Selectively 15N-Labeled Paramagnetic Proteins
Abstract:
A novel strategy for fast NMR resonance assignment of 15N HSQC spectra of proteins is presented. It requires the structure coordinates of the protein, a paramagnetic center, and one or more residue-selectively 15N-labeled samples. Comparison of sensitive undecoupled 15N HSQC spectra recorded on paramagnetic and diamagnetic samples yields data for every cross-peak on pseudocontact shift, paramagnetic relaxation enhancement, cross-correlation between Curie-spin and dipole-dipole relaxation, and residual dipolar coupling. Comparison of these four paramagnetic quantities with predictions from the three-dimensional structure simultaneously yields the resonance assignment and the anisotropy of the susceptibility tensor of the paramagnetic center. The method is demonstrated with the 30 kDa complex between the N-terminal domain of the epsilon subunit and the theta subunit of Escherichia coli DNA polymerase III. The program PLATYPUS was developed to perform the assignment, provide a measure of reliability of the assignment, and determine the susceptibility tensor anisotropy.
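The core matching step can be sketched in a few lines: predict one of the four quantities, the pseudocontact shift, from the coordinates and a trial susceptibility-anisotropy tensor, then assign cross-peaks to residues by minimising the total deviation between observed and predicted values. The sketch below is not the PLATYPUS program; it assumes the tensor frame is aligned with the coordinate axes and that the anisotropy parameters are already known, whereas in practice they would be fitted together with the assignment.

```python
# Minimal sketch of structure-based matching: predict pseudocontact shifts
# (PCS) from coordinates and a trial susceptibility-anisotropy tensor, then
# assign cross-peaks to residues by minimum total |observed - predicted|.
# Not the PLATYPUS algorithm; uses only one of the four paramagnetic
# quantities and assumes the tensor frame is already known.
import numpy as np
from scipy.optimize import linear_sum_assignment

def predicted_pcs(nuclei_xyz, metal_xyz, dchi_ax, dchi_rh):
    """PCS in ppm; coordinates in Angstrom, delta-chi values in 10^-32 m^3."""
    d = np.asarray(nuclei_xyz) - np.asarray(metal_xyz)
    x, y, z = d[:, 0], d[:, 1], d[:, 2]
    r2 = x**2 + y**2 + z**2
    geom = dchi_ax * (3 * z**2 - r2) + 1.5 * dchi_rh * (x**2 - y**2)
    return 1e4 / (12 * np.pi) * geom / r2**2.5

def assign(observed_pcs, nuclei_xyz, metal_xyz, dchi_ax, dchi_rh):
    """Map each cross-peak to a residue by minimising the summed PCS deviation."""
    pred = predicted_pcs(nuclei_xyz, metal_xyz, dchi_ax, dchi_rh)
    cost = np.abs(np.subtract.outer(np.asarray(observed_pcs), pred))  # peaks x residues
    peaks, residues = linear_sum_assignment(cost)
    return dict(zip(peaks, residues)), cost[peaks, residues].sum()
```

Repeating this matching over a grid of trial tensor orientations and anisotropies and keeping the combination with the lowest total deviation mirrors the idea of obtaining the assignment and the tensor anisotropy simultaneously.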
Abstract:
The binuclear complex [NBun4]4[Cr2(ox)5]·2CHCl3 has been prepared by an ion-exchange procedure employing Dowex 50WX2 cation-exchange resin in the n-butylammonium form and potassium tris(oxalato)chromate(III). The dimeric complex was characterised by a crystal structure determination: monoclinic, space group C2/c, a = 29.241(7), b = 15.192(2), c = 22.026(5) Å, β = 94.07(1)°, Z = 4. The magnetic susceptibility (300-4.2 K) indicated that the chromium(III) sites were antiferromagnetically coupled (J = -3.1 cm⁻¹).
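For readers unfamiliar with the sign convention, the reported coupling constant is most naturally read through the isotropic Heisenberg exchange Hamiltonian; the sketch below assumes the common −2J convention, since the abstract does not state which convention the paper uses.

```latex
\hat{H} = -2J\,\hat{\mathbf{S}}_1\cdot\hat{\mathbf{S}}_2,\qquad
E(S) = -J\bigl[S(S+1) - S_1(S_1+1) - S_2(S_2+1)\bigr],\qquad
S_1 = S_2 = \tfrac{3}{2},\ S = 0,1,2,3.
```

With J = -3.1 cm⁻¹ the S = 0 level lies lowest and the first excited triplet only 2|J| ≈ 6.2 cm⁻¹ above it, i.e. weak antiferromagnetic coupling, consistent with the susceptibility deviating from simple Curie behaviour only toward the low end of the 300-4.2 K range.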
Abstract:
The notion of salience was developed by Schelling in the context of the meeting-place problem of locating a partner in the absence of a pre-agreed meeting place. In this paper, we argue that a realistic specification of the meeting place problem involves allowing a strategy of active search over a range of possible meeting places. We solve this extended problem, allowing for extensions such as repeated play, search costs and asymmetric payoffs. The result is a considerably richer, but more complex, notion of salience. (C) 1998 Elsevier Science B.V.
Abstract:
CD4-selective targeting of an antibody-polycation-DNA complex was investigated. The complex was synthesised with the anti-CD4 monoclonal antibody B-F5, polylysine(268) (pLL) and either the pGL3 control vector containing the luciferase reporter gene or the pGeneGrip vector containing the green fluorescent protein (GFP) gene. B-F5-pLL-DNA complexes inhibited the binding of ¹²⁵I-B-F5 to CD4+ Jurkat cells, while complexes synthesised either without B-F5 or using a non-specific mouse IgG1 antibody had little or no effect. Expression of the luciferase reporter gene was achieved in Jurkat cells using the B-F5-pLL-pGL3 complex and was enhanced in the presence of PMA. Negligible luciferase activity was detected with the non-specific antibody complex in Jurkat cells or with the B-F5-pLL-pGL3 complex in the CD4- K-562 cells. Using complexes synthesised with the pGeneGrip vector, the transfection efficiency in Jurkat and K-562 cells was examined using confocal microscopy. More than 95% of Jurkat cells expressed GFP and the level of this expression was markedly enhanced by PMA. Negligible GFP expression was seen in K-562 cells or when B-F5 was replaced by a non-specific antibody. Using flow cytometry, fluorescein-labelled complex showed specific targeting to CD4+ cells in a mixed cell population from human peripheral blood. These studies demonstrate the selective transfection of CD4+ T-lymphoid cells using a polycation-based gene delivery system. The complex may provide a means of delivering anti-HIV gene therapies to CD4+ cells in vivo.
Abstract:
The World Wide Web (WWW) is useful for distributing scientific data. Most existing web data resources organize their information either in structured flat files or relational databases with basic retrieval capabilities. For databases with one or a few simple relations, these approaches are successful, but they can be cumbersome when there is a data model involving multiple relations between complex data. We believe that knowledge-based resources offer a solution in these cases. Knowledge bases have explicit declarations of the concepts in the domain, along with the relations between them. They are usually organized hierarchically, and provide a global data model with a controlled vocabulary. We have created the OWEB architecture for building online scientific data resources using knowledge bases. OWEB provides a shell for structuring data, providing secure and shared access, and creating computational modules for processing and displaying data. In this paper, we describe the translation of the online immunological database MHCPEP into an OWEB system called MHCWeb. This effort involved building a conceptual model for the data, creating a controlled terminology for the legal values for different types of data, and then translating the original data into the new structure. The OWEB environment allows for flexible access to the data by both users and computer programs.
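As a concrete illustration of what explicit concept declarations, a hierarchy and a controlled vocabulary buy over a flat file, here is a minimal sketch; the class, slot and value names are invented for illustration and are not taken from the actual MHCWeb/OWEB schema.

```python
# Minimal sketch of a knowledge-base style data model: concepts are declared
# explicitly, organised hierarchically, and slot values are restricted to a
# controlled vocabulary.  Names are hypothetical, not the MHCWeb schema.
from dataclasses import dataclass, field

CONTROLLED_VOCAB = {
    "mhc_class": {"class I", "class II"},            # legal values for a slot
    "binding_level": {"none", "low", "moderate", "high"},
}

@dataclass
class Concept:
    name: str
    parent: "Concept | None" = None                  # hierarchical organisation
    slots: dict = field(default_factory=dict)

    def set_slot(self, slot, value):
        allowed = CONTROLLED_VOCAB.get(slot)
        if allowed is not None and value not in allowed:
            raise ValueError(f"{value!r} is not a legal value for {slot!r}")
        self.slots[slot] = value

molecule = Concept("MHCMolecule")
hla_a2 = Concept("HLA-A*02:01", parent=molecule)
hla_a2.set_slot("mhc_class", "class I")              # accepted
# hla_a2.set_slot("mhc_class", "class IV")           # would raise ValueError
```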
Abstract:
In this paper we propose a new framework for evaluating designs based on work domain analysis, the first phase of cognitive work analysis. We develop a rationale for a new approach to evaluation by describing the unique characteristics of complex systems and by showing that systems engineering techniques only partially accommodate these characteristics. We then present work domain analysis as a complementary framework for evaluation. We explain this technique by example, showing how the Australian Defence Force used work domain analysis to evaluate design proposals for a new system called Airborne Early Warning and Control. This case study also demonstrates that work domain analysis is a useful and feasible approach that complements standard techniques for evaluation and that promotes a central role for human factors professionals early in the system design and development process. Actual or potential applications of this research include the evaluation of designs for complex systems.
Abstract:
This paper presents an agent-based approach to modelling individual driver behaviour under the influence of real-time traffic information. The driver behaviour models developed in this study are based on a behavioural survey of drivers which was conducted on a congested commuting corridor in Brisbane, Australia. Commuters' responses to travel information were analysed and a number of discrete choice models were developed to determine the factors influencing drivers' behaviour and their propensity to change route and adjust travel patterns. Based on the results obtained from the behavioural survey, the agent behaviour parameters which define driver characteristics, knowledge and preferences were identified and their values determined. A case study implementing a simple agent-based route choice decision model within a microscopic traffic simulation tool is also presented. Driver-vehicle units (DVUs) were modelled as autonomous software components that can each be assigned a set of goals to achieve and a database of knowledge comprising certain beliefs, intentions and preferences concerning the driving task. Each DVU provided route choice decision-making capabilities, based on perception of its environment, that were similar to the described intentions of the driver it represented. The case study clearly demonstrated the feasibility of the approach and the potential to develop more complex driver behavioural dynamics based on the belief-desire-intention agent architecture. (C) 2002 Elsevier Science Ltd. All rights reserved.
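A minimal sketch of the belief-desire-intention pattern described above is given below; the attribute names, threshold and re-routing rule are illustrative assumptions, not the parameter values estimated from the Brisbane survey.

```python
# Minimal sketch of a belief-desire-intention style driver-vehicle unit (DVU).
# Attribute names, the threshold and the diversion rule are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DriverVehicleUnit:
    goal: str = "reach destination"                    # desire
    beliefs: dict = field(default_factory=dict)        # expected delay (min) per route
    prefers_familiar_route: bool = True                # preference
    diversion_threshold_min: float = 10.0              # propensity to change route

    def perceive(self, broadcast):
        """Update beliefs from real-time traffic information."""
        self.beliefs.update(broadcast)

    def decide_route(self, current_route, alternative_route):
        """Form an intention: stay on the current route or divert."""
        saving = self.beliefs.get(current_route, 0.0) - self.beliefs.get(alternative_route, 0.0)
        threshold = self.diversion_threshold_min * (1.5 if self.prefers_familiar_route else 1.0)
        return alternative_route if saving > threshold else current_route

dvu = DriverVehicleUnit(beliefs={"M1": 25.0, "arterial": 8.0})
print(dvu.decide_route("M1", "arterial"))   # -> "arterial": expected saving exceeds threshold
```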
Abstract:
This paper describes the kinematics and muscle activity associated with the standard sit-up, as a first step in the investigation of complex motor coordination. Eight normal human subjects lay on a force table and performed at least 15 sit-ups, with the arms across the chest and the legs straight and unconstrained. Several subjects also performed sit-ups with an additional weight added to the head. Support surface forces were recorded to calculate the location of the center of pressure and center of gravity; conventional motion analysis was used to measure segmental positions; and surface EMG was recorded from eight muscles. While the sit-up consists of two serial components, 'trunk curling' and 'footward pelvic rotation', it can be further subdivided into five phases, based on the kinematics. Phases I and II comprise trunk curling. Phase I consists of neck and upper trunk flexion, and phase II consists of lumbar trunk lifting. Phase II corresponds to the point of peak muscle contraction and maximum postural instability, the 'critical point' of the sit-up. Phases III-V comprise footward pelvic rotation. Phase III begins with pelvic rotation towards the feet, phase IV with leg lowering, and phase V with contact between the legs and the support surface. The overall pattern of muscle activity was complex, with times of EMG onset, peak activity, offset, and duration differing for different muscles. This complex pattern changed qualitatively from one phase to the next, suggesting that the roles of different muscles and, as a consequence, the overall form of coordination, change during the sit-up. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
We present a technique for team design based on cognitive work analysis (CWA). We first develop a rationale for this technique by discussing the limitations of conventional approaches for team design in light of the special characteristics of first-of-a-kind, complex systems. We then introduce the CWA-based technique for team design and provide a case study of how we used this technique to design a team for a first-of-a-kind, complex military system during the early stages of its development. In addition to illustrating the CWA-based technique by example, the case study allows us to evaluate the technique. This case study demonstrates that the CWA-based technique for team design is both feasible and useful, although empirical validation of the technique is still necessary. Applications of this work include the design of teams for first-of-a-kind, complex systems in military, medical, and industrial domains.
Abstract:
Spatial data is now used extensively in the Web environment, providing online customized maps and supporting map-based applications. The full potential of Web-based spatial applications, however, has yet to be achieved due to performance issues related to the large sizes and high complexity of spatial data. In this paper, we introduce a multiresolution approach to spatial data management and query processing such that the database server can choose spatial data at the right resolution level for different Web applications. One highly desirable property of the proposed approach is that the server-side processing cost and network traffic can be reduced when the level of resolution required by applications is low. Another advantage is that our approach pushes complex multiresolution structures and algorithms into the spatial database engine. That is, the developer of spatial Web applications need not be concerned with such complexity. This paper explains the basic idea, technical feasibility and applications of multiresolution spatial databases.
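The resolution-selection idea can be sketched as follows, assuming the server stores several pre-generalised versions of each geometry and answers a query from the coarsest level whose simplification error is acceptable at the requested map scale; the level boundaries, the sub-pixel tolerance rule and the use of shapely here are illustrative assumptions, not the paper's data structures.

```python
# Sketch of server-side resolution selection: pre-generalise each geometry at
# several tolerances, then answer a query from the coarsest level whose
# simplification error stays below one pixel at the requested map scale.
from shapely.geometry import LineString

TOLERANCES = (0.0, 10.0, 100.0, 1000.0)     # map units; illustrative levels

def build_levels(geom, tolerances=TOLERANCES):
    """Pre-compute one simplified representation per resolution level."""
    return [geom.simplify(t, preserve_topology=True) for t in tolerances]

def choose_level(levels, map_units_per_pixel, tolerances=TOLERANCES):
    """Pick the coarsest level whose simplification error is sub-pixel."""
    best = 0
    for i, t in enumerate(tolerances):
        if t <= map_units_per_pixel:
            best = i
    return levels[best]

road = LineString([(0, 0), (5, 1), (10, 0), (20, 4), (40, 0)])
levels = build_levels(road)
overview = choose_level(levels, map_units_per_pixel=50.0)   # coarse level for a small-scale map
```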
Abstract:
The commercially available Jacobsen catalyst, Mn(salen), was occluded in hybrid polymeric membranes based on poly(dimethylsiloxane) (PDMS) and poly(vinyl alcohol) (PVA). The obtained systems were characterized by UV-vis spectroscopy and SEM techniques. The membranes were used as a catalytic barrier between two different phases: an organic substrate phase (cyclooctene or styrene) in the absence of solvent, and an aqueous solution of either t-BuOOH or H₂O₂. Membranes containing different percentages of PVA were prepared, in order to modulate their hydrophilic/hydrophobic swelling properties. The occluded complex proved to be an efficient catalyst for the oxidation of alkenes. The new triphasic system containing a cheap and easily available catalyst allowed substrate oxidation and easy product separation using "green" oxidants. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Burkholderia cepacia complex isolates obtained by microbiological culture of respiratory samples from Brazilian CF patients were studied by recA-based PCR, screened by specific PCR for virulence markers and genotyped by RAPD. Forty-one B. cepacia complex isolates were identified by culture; confirmation of identity and genomovar determination was obtained for 32 isolates, with a predominance of B. cenocepacia (53.1%). Virulence markers were not consistently found among isolates. Genotyping did not identify identical patterns among different patients. B. cenocepacia was the most prevalent B. cepacia complex member among our patients, and cross-infection does not seem to occur among them. © 2008 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
Abstract:
Preoperative progressive pneumoperitoneum (PPP) is a safe and effective procedure in the treatment of large incisional hernia (size > 10 cm in width or length) with loss of domain (LIHLD). There is no consensus in the literature on the amount of gas that must be insufflated in a PPP program or on how long it should be maintained. We describe a technique for calculating the hernia sac volume (HSV) and abdominal cavity volume (ACV) based on abdominal computerized tomography (ACT) scanning that eliminates the need for subjective criteria for inclusion in a PPP program and indicates the amount of gas that must be insufflated into the abdominal cavity. Our technique is indicated for all patients with large or recurrent incisional hernias evaluated by a senior surgeon for suspected LIHLD. We reviewed our experience from 2001 to 2008 with 23 consecutive surgical procedures for LIHLD that included preoperative evaluation with CT scanning and PPP. An ACT was required in all patients with suspected LIHLD in order to determine HSV and ACV. The PPP was performed only if the volume ratio HSV/ACV (VR = HSV/ACV) was ≥25%. We performed this procedure on 23 patients (16 women and 7 men), with a mean age of 55.6 years (range 31-83) and a mean BMI of 38.5 kg/m² (range 23-55.2). Almost all patients (21 of 23; 91.3%) were overweight, and 43.5% (10 patients) were severely obese (obese class III). The mean calculated volumes for ACV and HSV were 9,410 ml (range 6,060-19,230 ml) and 4,500 ml (range 1,850-6,600 ml), respectively. The PPP is performed through a permanent catheter placed in a minor surgical procedure. The total amount of CO₂ insufflated ranged from 2,000 to 7,000 ml (mean 4,000 ml). Patients required a mean of 10 PPP sessions (range 4-18) to achieve the desired volume of gas (that is, the same volume as calculated for the hernia sac). Since PPP sessions were performed once a day, 4-18 days were needed for preoperative preparation with PPP. The mean VR was 36% (range 26-73%). We conclude that ACT provides objective data for volume calculation of both the hernia sac and the abdominal cavity, and also for estimation of the volume of gas that should be insufflated into the abdominal cavity in PPP.
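The inclusion rule reduces to simple arithmetic on the two CT-derived volumes, as in the sketch below; the function names are illustrative and the volumes would come from segmentation of the CT slices.

```python
# Sketch of the CT-based inclusion rule: compute the hernia sac volume (HSV)
# to abdominal cavity volume (ACV) ratio and admit the patient to the PPP
# programme only when the ratio reaches 25%.
def volume_ratio(hsv_ml: float, acv_ml: float) -> float:
    return hsv_ml / acv_ml

def eligible_for_ppp(hsv_ml: float, acv_ml: float, cutoff: float = 0.25) -> bool:
    return volume_ratio(hsv_ml, acv_ml) >= cutoff

# Example with values inside the ranges reported in the abstract:
print(round(volume_ratio(4500, 9410), 2))   # 0.48
print(eligible_for_ppp(4500, 9410))         # True; target insufflation volume = HSV
```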
Abstract:
The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the aim of interest of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method was to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI). This, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
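A much-simplified sketch of the cluster-level idea is shown below: keep several principal components per ROI instead of a single average, then ask whether lagged components of one ROI improve the prediction of the other. This uses a plain least-squares comparison of residual sums of squares rather than the partial-canonical-correlation formulation and bootstrap test of CGA, and all array shapes and parameter choices are illustrative.

```python
# Simplified cluster-level Granger sketch: several principal components per
# ROI (eigen-time series) instead of a single average, then test whether
# adding lagged ROI-X components reduces the prediction error of ROI-Y.
import numpy as np

def pca_components(voxels_ts, k=3):
    """voxels_ts: (time, voxels). Return the first k principal-component scores."""
    x = voxels_ts - voxels_ts.mean(axis=0)
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :k] * s[:k]                     # (time, k) eigen-time series

def lagged(ts, p):
    """Stack p lags of a (time, k) series into a regressor matrix."""
    t = ts.shape[0]
    return np.hstack([ts[p - i - 1:t - i - 1] for i in range(p)])

def rss(design, target):
    """Residual sum of squares of a least-squares fit."""
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ beta
    return float((resid ** 2).sum())

def granger_gain(roi_x, roi_y, p=2, k=3):
    """Relative drop in residual variance of ROI-Y when ROI-X lags are added."""
    x, y = pca_components(roi_x, k), pca_components(roi_y, k)
    target = y[p:]
    own = lagged(y, p)                          # ROI-Y's own past
    full = np.hstack([own, lagged(x, p)])       # plus ROI-X's past
    rss_own, rss_full = rss(own, target), rss(full, target)
    return (rss_own - rss_full) / rss_own       # > 0 suggests X Granger-causes Y
```

In the paper's formulation the influence measure and its significance would instead come from partial canonical correlation and a bootstrap test.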
Abstract:
We suggest a new notion of behaviour-preserving refinement based on partial-order semantics, called transition refinement. We introduced transition refinement for elementary (low-level) Petri nets earlier. For modelling and verifying complex distributed algorithms, high-level (Algebraic) Petri nets are usually used. In this paper, we define transition refinement for Algebraic Petri nets. This notion is more powerful than transition refinement for elementary Petri nets because it corresponds to the simultaneous refinement of several transitions in an elementary Petri net. Transition refinement is particularly suitable for refinement steps that increase the degree of distribution of an algorithm, e.g. when synchronous communication is replaced by asynchronous message passing. We study how to prove that a replacement of a transition is a transition refinement.
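The kind of refinement step mentioned at the end, replacing synchronous communication by asynchronous message passing, can be illustrated on a plain place/transition net (not an Algebraic Petri net); whether such a replacement is behaviour preserving in the paper's sense is exactly what the proof technique would have to establish, so the sketch below only shows the structural substitution.

```python
# Tiny illustration of a distribution-increasing refinement: one synchronous
# "handshake" transition is replaced by a send transition, a message-buffer
# place and a receive transition.  Plain place/transition net, names invented.
class Net:
    def __init__(self, places, transitions):
        self.places = set(places)
        # transitions: name -> (set of input places, set of output places)
        self.transitions = dict(transitions)

    def refine_transition(self, name, send, buffer, receive):
        """Replace `name` by send -> buffer -> receive, preserving its interface."""
        pre, post = self.transitions.pop(name)
        self.places.add(buffer)
        self.transitions[send] = (pre, {buffer})
        self.transitions[receive] = ({buffer}, post)

sync = Net({"p_sender_ready", "p_receiver_done"},
           {"t_handshake": ({"p_sender_ready"}, {"p_receiver_done"})})
sync.refine_transition("t_handshake", "t_send", "p_msg_buffer", "t_receive")
print(sync.transitions)
# {'t_send': ({'p_sender_ready'}, {'p_msg_buffer'}),
#  't_receive': ({'p_msg_buffer'}, {'p_receiver_done'})}
```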