943 results for Problems of Computer Intellectualization
Abstract:
Cyclodextrins (CDs) are annular oligosaccharides containing 6-12 glucose units joined together by alpha-1,4 bonds. They have a truncated-cone shape with a lipophilic cavity in which different molecules can be included, resulting in a stable inclusion complex. Cyclodextrins have been widely applied in pharmaceutical technology with the objective of increasing the solubility, stability and bioavailability of drugs in different pharmaceutical dosage forms, such as tablets. In order to obtain beta-CD tablets, liquid dispersions of drug/beta-CD are usually submitted to different drying processes, such as spray-drying, freeze-drying or slow evaporation, the dried material then being added to a number of excipients. However, such drying processes can generate particulate materials with poor flow and compressibility, which must be converted into granulates by wetting with a granulation liquid followed by additional drying. The main objective of this work was to evaluate the preparation of tablets without the need for this additional drying step. For this purpose, an aqueous dispersion containing the acetaminophen/beta-CD complex and cornstarch was dried in a spouted bed and the resulting granules were compressed into tablets. Acetaminophen was used as a model drug because of its low water solubility, and the inexpensive and widely available cornstarch was chosen as the excipient. Acetaminophen powder was added to a beta-cyclodextrin solution prepared in distilled water at 70 degrees C. Stirring was maintained until this dispersion cooled to room temperature. Cornstarch was then added and the resulting dispersion was dried in spouted bed equipment. This material was compressed into tablets using an Erweka Korsh EKO tablet machine. This innovative approach allowed the tablet preparation process to be carried out with fewer steps and represents a technologically reliable strategy for producing beta-cyclodextrin inclusion complex tablets. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The demand for more pixels is beginning to be met as manufacturers increase the native resolution of projector chips. Tiling several projectors still offers a solution to augment the pixel capacity of a display. However, problems of color and illumination uniformity across projectors need to be addressed as well as the computer software required to drive such devices. We present the results obtained on a desktop-size tiled projector array of three D-ILA projectors sharing a common illumination source. A short throw lens (0.8:1) on each projector yields a 21-in. diagonal for each image tile; the composite image on a 3×1 array is 3840×1024 pixels with a resolution of about 80 dpi. The system preserves desktop resolution, is compact, and can fit in a normal room or laboratory. The projectors are mounted on precision six-axis positioners, which allow pixel level alignment. A fiber optic beamsplitting system and a single set of red, green, and blue dichroic filters are the key to color and illumination uniformity. The D-ILA chips inside each projector can be adjusted separately to set or change characteristics such as contrast, brightness, or gamma curves. The projectors were then matched carefully: photometric variations were corrected, leading to a seamless image. Photometric measurements were performed to characterize the display and are reported here. This system is driven by a small PC cluster fitted with graphics cards and running Linux. It can be scaled to accommodate an array of 2×3 or 3×3 projectors, thus increasing the number of pixels of the final image. Finally, we present current uses of the display in fields such as astrophysics and archaeology (remote sensing).
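The quoted composite size and pixel density follow directly from the tile geometry. As a quick check, the minimal sketch below (assuming each D-ILA tile is 1280x1024, which is implied by the 3840x1024 composite of a 3x1 array) reproduces the roughly 80 dpi figure; the numbers are taken from the abstract, not from the original paper's data.

```python
import math

# Assumed per-tile resolution implied by the 3x1 composite of 3840x1024 pixels.
tile_w, tile_h = 1280, 1024
tiles_across, tiles_down = 3, 1
tile_diagonal_in = 21.0  # 21-inch diagonal per image tile (0.8:1 short-throw lens)

# Composite image size in pixels.
comp_w = tile_w * tiles_across
comp_h = tile_h * tiles_down
print(f"composite image: {comp_w}x{comp_h} pixels")

# Pixel density: diagonal pixel count divided by the diagonal size in inches.
diag_px = math.hypot(tile_w, tile_h)
dpi = diag_px / tile_diagonal_in
print(f"approximate resolution: {dpi:.0f} dpi")  # ~78 dpi, i.e. about 80 dpi
```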
Abstract:
The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices and has never been applied to eigenanalysis for power system small-signal stability. This paper analyzes the differences between the BR and QR algorithms, comparing their performance in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm uses accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space while retaining appropriate precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in eigenanalysis of 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is a more efficient algorithm for large-scale power system small-signal stability eigenanalysis.
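The BR algorithm itself is not available in common numerical libraries, but the QR-based workflow it is compared against can be sketched: reduce the state matrix to upper Hessenberg form, compute all eigenvalues, and check that every eigenvalue has a negative real part. The sketch below uses a small random matrix as a stand-in for a linearized power system state matrix and illustrates only the QR baseline, not the BR accelerations described in the paper.

```python
import numpy as np
from scipy.linalg import hessenberg

# Illustrative small-signal state matrix (a real system matrix would come from
# linearizing the power system model around an operating point).
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))

# Step 1: reduce A to upper Hessenberg form, the input form both QR and BR expect.
H = hessenberg(A)

# Step 2: compute all eigenvalues; NumPy/LAPACK uses the QR algorithm internally.
eigs = np.linalg.eigvals(H)

# Small-signal stability requires every eigenvalue to have a negative real part.
print("max real part:", eigs.real.max())
print("stable:", bool((eigs.real < 0).all()))
```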
Abstract:
This paper reviews the potential use of three types of spatial technology by land managers, namely satellite imagery, satellite positioning systems and supporting computer software. Developments in remote sensing and the relative advantages of multispectral and hyperspectral images are discussed. The main challenge to the wider use of remote sensing as a land management tool is seen as uncertainty over whether apparent relationships between biophysical variables and spectral reflectance are direct and causal, or artefacts of particular images. Developments in satellite positioning systems are presented in the context of land managers' need for position estimates in situations where absolute precision may or may not be required. The role of computer software in supporting developments in spatial technology is described. Spatial technologies are seen as having matured beyond empirical applications to the stage where they are useful and reliable land management tools. In addition, computer software has become more user-friendly, and this has facilitated data collection and manipulation by semi-expert as well as specialist staff.
Abstract:
The debate about the dynamics of, and potential policy responses to, asset inflation has intensified in recent years. Some analysts, notably Borio and Lowe, have called for 'subtle' changes to existing monetary targeting frameworks to try to deal with the problems of asset inflation and have attempted to develop indicators of financial vulnerability to aid this process. In contrast, this paper argues that the uncertainties involved in understanding financial market developments and their potential impact on the real economy are likely to remain too high to embolden policy makers. The political and institutional risks associated with policy errors are also significant. The fundamental premise that a liberalised financial system is based on 'efficient' market allocation cannot be overlooked. The corollary is that any serious attempt to stabilize financial market outcomes must involve at least a partial reversal of deregulation.
Abstract:
Every day trillions of dollars circulate the globe in a digital data space and new forms of property and ownership emerge. Massive corporate entities with a global reach are formed and disappear with breathtaking speed, making and breaking personal fortunes the size of which defies imagination. Fictitious commodities abound. The genomes of entire nations have become corporately owned. Relationships have become the overt basis of economic wealth and political power. Hypercapitalism explores the problems of understanding this emergent form of global political economic organization by focusing on the internal relations between language, new media networks, and social perceptions of value. Taking an historical approach informed by Marx, Phil Graham draws upon writings in political economy, media studies, sociolinguistics, anthropology, and critical social science to understand the development, roots, and trajectory of the global system in which every possible aspect of human existence, including imagined futures, has become a commodity form.
Abstract:
alpha-Conotoxin MII, a 16-residue polypeptide from the venom of the piscivorous cone snail Conus magus, is a potent and highly specific blocker of mammalian neuronal nicotinic acetylcholine receptors composed of alpha 3 beta 2 subunits. The role of this receptor type in the modulation of neurotransmitter release and its relevance to the problems of addiction and psychosis emphasize the importance of a structural understanding of the mode of interaction of MII with the alpha 3 beta 2 interface. Here we describe the three-dimensional solution structure of MII determined using 2D H-1 NMR spectroscopy. Structural restraints consisting of 376 interproton distances inferred from NOEs and 12 dihedral restraints derived from spin-spin coupling constants were used as input for simulated annealing calculations and energy minimization in the program X-PLOR. The final set of 20 structures is exceptionally well-defined, with mean pairwise rms differences over the whole molecule of 0.07 Angstrom for the backbone atoms and 0.34 Angstrom for all heavy atoms. MII adopts a compact structure incorporating a central segment of alpha-helix and beta-turns at the N- and C-termini. The molecule is stabilized by two disulfide bonds, which provide cross-links between the N-terminus and both the middle and C-terminus of the structure. The susceptibility of the structure to conformational change was examined using several different solvent conditions. While the global fold of MII remains the same, the structure is stabilized in a more hydrophobic environment provided by the addition of acetonitrile or trifluoroethanol to the aqueous solution. The distribution of amino acid side chains in MII creates distinct hydrophobic and polar patches on its surface that may be important for the specific interaction with the alpha 3 beta 2 neuronal nAChR. A comparison of the structure of MII with other neuronal-specific alpha-conotoxins provides insights into their mode of interaction with these receptors.
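The "mean pairwise rms difference" quoted for the 20-structure ensemble is a standard measure of ensemble precision. A minimal sketch of the calculation, assuming the coordinate sets have already been superimposed (as the structure-calculation software normally does) and using random toy coordinates in place of the real backbone atoms:

```python
import itertools
import numpy as np

def rmsd(a: np.ndarray, b: np.ndarray) -> float:
    """Root-mean-square deviation between two already-superimposed
    coordinate sets of shape (n_atoms, 3)."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

def mean_pairwise_rmsd(ensemble: list[np.ndarray]) -> float:
    """Mean RMSD over all distinct pairs of structures in the ensemble."""
    pairs = list(itertools.combinations(range(len(ensemble)), 2))
    return sum(rmsd(ensemble[i], ensemble[j]) for i, j in pairs) / len(pairs)

# Toy ensemble of 20 'structures': random coordinates stand in for the backbone
# atoms of the 20 NMR models; real input would come from the calculated PDB files.
rng = np.random.default_rng(1)
reference = rng.standard_normal((64, 3))
ensemble = [reference + 0.05 * rng.standard_normal((64, 3)) for _ in range(20)]
print(f"mean pairwise RMSD: {mean_pairwise_rmsd(ensemble):.2f} A")
```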
Abstract:
Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational and conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.
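The practical payoff of model-guided pre-screening is the enrichment of true binders among the peptides actually synthesised and tested. A back-of-the-envelope calculation using only the ranges quoted in the abstract (27-71% success versus a 0.1-5% natural prevalence) illustrates the point; the numbers are bounds, and any particular antigen/HLA combination will fall somewhere inside them.

```python
# Enrichment achieved by model-guided selection of peptides for laboratory testing.
natural_prevalence = (0.001, 0.05)   # fraction of random peptides that bind MHC
model_success_rate = (0.27, 0.71)    # fraction of model-selected peptides confirmed

low_enrichment = model_success_rate[0] / natural_prevalence[1]   # worst case
high_enrichment = model_success_rate[1] / natural_prevalence[0]  # best case
print(f"enrichment over random testing: {low_enrichment:.0f}x to {high_enrichment:.0f}x")
# i.e. roughly 5-fold to several-hundred-fold fewer peptides need to be assayed
# to find each binder.
```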
Abstract:
The occurrence of foliated rock masses is common in the mining environment. Methods employing a continuum approximation to describe the deformation of such rock masses possess a clear advantage over methods in which each rock layer and each inter-layer interface (joint) is explicitly modelled. In devising such a continuum model it is imperative that the moment (couple) stresses and internal rotations associated with the bending of the rock layers be properly incorporated in the model formulation. Such an approach leads to a Cosserat-type theory. In the present model, the behaviour of the intact rock layer is assumed to be linearly elastic and the joints are assumed to be elastic-perfectly plastic. Conditions of slip at the interfaces are determined by a Mohr-Coulomb criterion with a tension cut-off at zero normal stress. The theory is valid for large deformations. The model is incorporated into the finite element program AFENA and validated against an analytical solution of elementary buckling problems of a layered medium under gravity loading. A design chart suitable for assessing the stability of slopes in foliated rock masses against flexural buckling failure has been developed. The design chart is easy to use and provides a quick estimate of critical loading factors for slopes in foliated rock masses. It is shown that the model based on Euler's buckling theory as proposed by Cavers (Rock Mechanics and Rock Engineering 1981; 14:87-104) substantially overestimates the critical heights for a vertical slope and underestimates them for sub-vertical slopes. Copyright (C) 2001 John Wiley & Sons, Ltd.
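The interface behaviour described above can be summarised as a simple slip/opening check. The sketch below is a minimal illustration of a Mohr-Coulomb criterion with a tension cut-off at zero normal stress; the cohesion, friction angle and stress states are hypothetical, and the sign convention (compression positive) and classification logic are assumptions for illustration, not the paper's actual constitutive implementation.

```python
import math

def joint_state(sigma_n: float, tau: float, cohesion: float, phi_deg: float) -> str:
    """Classify an inter-layer joint as elastic, slipping, or open.

    sigma_n : normal stress on the joint (compression positive, MPa)
    tau     : shear stress on the joint (MPa)
    cohesion, phi_deg : Mohr-Coulomb parameters (MPa, degrees)
    """
    if sigma_n < 0.0:
        # Tension cut-off at zero normal stress: the joint cannot carry tension.
        return "open (tensile separation)"
    shear_strength = cohesion + sigma_n * math.tan(math.radians(phi_deg))
    return "slipping" if abs(tau) >= shear_strength else "elastic"

# Hypothetical joint parameters and stress states, for illustration only.
for sigma_n, tau in [(2.0, 0.5), (2.0, 2.5), (-0.1, 0.2)]:
    print(sigma_n, tau, "->", joint_state(sigma_n, tau, cohesion=0.1, phi_deg=30.0))
```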
Abstract:
Environmental processes have been modelled for decades. However, the need for integrated assessment and modelling (IAM) has grown as the extent and severity of environmental problems in the 21st Century have worsened. The scale of IAM is not restricted to the global level, as in climate change models, but includes local and regional models of environmental problems. This paper discusses various definitions of IAM and identifies five different types of integration that are needed for the effective solution of environmental problems. The future is then depicted in the form of two brief scenarios: one optimistic and one pessimistic. The current state of IAM is then briefly reviewed. The issues of complexity and validation in IAM are recognised as more complex than in traditional disciplinary approaches. Communication is identified as a central issue, both internally among team members and externally with decision-makers, stakeholders and other scientists. Finally, it is concluded that the process of integrated assessment and modelling is considered as important as the product for any particular project. By learning to work together and to recognise the contribution of all team members and participants, it is believed that we will have a strong scientific and social basis to address the environmental problems of the 21st Century. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
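Of the methods listed, the quantitative (position-specific scoring) matrix is the simplest to illustrate: a peptide's predicted binding score is the sum of per-position residue coefficients, and an antigen is scanned with a sliding window. The matrix coefficients, threshold, and antigen sequence below are invented for illustration; a real matrix would be trained on measured binding data for a specific MHC allele.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
random.seed(0)
# matrix[pos][aa] = contribution of amino acid `aa` at peptide position `pos`
matrix = [{aa: random.uniform(-1.0, 1.0) for aa in AMINO_ACIDS} for _ in range(9)]

def score_peptide(peptide: str) -> float:
    """Additive quantitative-matrix score of a 9-mer peptide."""
    assert len(peptide) == 9
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

def scan_antigen(sequence: str, threshold: float = 3.0) -> list[tuple[str, float]]:
    """Slide a 9-residue window along an antigen and keep high-scoring peptides."""
    hits = []
    for i in range(len(sequence) - 8):
        pep = sequence[i:i + 9]
        s = score_peptide(pep)
        if s >= threshold:
            hits.append((pep, round(s, 2)))
    return hits

# Hypothetical antigen sequence, for illustration only.
antigen = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"
print(scan_antigen(antigen))
```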
Abstract:
A number of theoretical and experimental investigations have been made into the nature of purlin-sheeting systems over the past 30 years. These systems commonly consist of cold-formed zed- or channel-section purlins connected to corrugated sheeting. They have proven difficult to model due to the complexity of both the purlin deformation and the restraint provided to the purlin by the sheeting. Part 1 of this paper presented a non-linear elasto-plastic finite element model which, by incorporating both the purlin and the sheeting in the analysis, allowed the interaction between the two components of the system to be modelled. This paper presents a simplified version of the first model which has considerably decreased requirements in terms of computer memory, running time and data preparation. The Simplified Model includes only the purlin but allows for the sheeting's shear and rotational restraints by modelling these effects as springs located at the purlin-sheeting connections. Two accompanying programs determine the stiffness of these springs numerically. As in the Full Model, the Simplified Model is able to account for the cross-sectional distortion of the purlin, the shear and rotational restraining effects of the sheeting, and failure of the purlin by local buckling or yielding. The model requires no experimental or empirical input, and its validity is shown by its good correlation with experimental results. (C) 1997 Elsevier Science Ltd.
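The essential idea of the Simplified Model, replacing the sheeting by shear and rotational springs at each purlin-sheeting connection, amounts to adding the spring stiffnesses to the corresponding diagonal terms of the purlin's global stiffness matrix. The schematic sketch below uses a placeholder stiffness matrix, hypothetical degree-of-freedom numbering and spring values; it illustrates the general technique, not the paper's actual formulation.

```python
import numpy as np

def add_connection_springs(K: np.ndarray, connections: list[dict]) -> np.ndarray:
    """Add sheeting restraint springs to a purlin global stiffness matrix K.

    Each connection supplies a lateral (shear) spring and a rotational spring,
    added to the diagonal entries of the corresponding degrees of freedom.
    """
    K = K.copy()
    for c in connections:
        K[c["lateral_dof"], c["lateral_dof"]] += c["k_shear"]       # N/mm
        K[c["rotation_dof"], c["rotation_dof"]] += c["k_rotation"]  # N*mm/rad
    return K

# Toy 12-DOF purlin stiffness matrix (placeholder; a real K would come from the
# finite element discretisation of the purlin).
n_dof = 12
K_purlin = np.eye(n_dof) * 1.0e4

# Hypothetical spring stiffnesses, standing in for the values the two
# accompanying programs would determine numerically from the sheeting.
connections = [
    {"lateral_dof": 2, "rotation_dof": 3, "k_shear": 250.0, "k_rotation": 9.0e4},
    {"lateral_dof": 8, "rotation_dof": 9, "k_shear": 250.0, "k_rotation": 9.0e4},
]
K_total = add_connection_springs(K_purlin, connections)
print(np.diag(K_total))
```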
Abstract:
Objective: To document outcome and to investigate patterns of physical and psychosocial recovery in the first year following severe traumatic brain injury (TBI) in an Australian patient sample. Design: A longitudinal prospective study of a cohort of patients, with data collection at 3, 6, 9, and 12 months post injury. Setting: A head injury rehabilitation unit in a large metropolitan public hospital. Patients: A sample of 55 patients selected from 120 consecutive admissions with severe TBI. Patients who were more than 3 months post injury on admission, who remained confused, or who had severe communication deficits or a previous neurologic disorder were excluded. Interventions: All subjects participated in a multidisciplinary inpatient rehabilitation program, followed by varied participation in outpatient rehabilitation and community-based services. Main Outcome Measures: The Sickness Impact Profile (SIP) provided physical, psychosocial, and total dysfunction scores at each follow-up. Outcome at 1 year was measured by the Disability Rating Scale. Results: Multivariate analysis of variance indicated that the linear trend of recovery over time was less for psychosocial dysfunction than for physical dysfunction (F(1,51) = 5.87, P < .02). One year post injury, 22% of subjects had returned to their previous level of employability, and 42% were able to live independently. Conclusions: Recovery from TBI in this Australian sample followed a pattern similar to that observed in other countries, with psychosocial dysfunction being more persistent. Self-report measures such as the SIP are limited in TBI research by problems of diminished self-awareness.
Abstract:
The absence of considerations of technology in policy studies reinforces the popular notion that technology is a neutral tool. Through an analysis of the role played by computers in the policy processes of Australia's Department of Social Security, this paper argues that computers are political players in policy processes. Findings indicate that computers make aspects of the social domain knowable and therefore governable. The use of computers makes previously infeasible policies possible. Computers also operate as bureaucrats and as agents of client surveillance. Increased policy change, reduced discretion, and increasingly targeted and complex policies can be attributed to the use of computer technology. If policy processes are to be adequately understood and analysed, then the role of technology in those processes must be considered.
Abstract:
This study examined the impact of computer and assistive device use on the employment status and vocational modes of people with physical disabilities in Australia. A survey was distributed to people over 15 years of age with physical disabilities living in the Brisbane area. Responses were received from 82 people, including those with spinal cord injuries, cerebral palsy and muscular dystrophy. Of the respondents, 46 were employed, 22 were unemployed, and 12 were either students or undertaking voluntary work. Three-quarters of respondents used a computer in their occupations, while 15 used assistive devices. Using logistic regression analysis, it was found that gender, education, level of computer skill and computer training were significant predictors of employment outcomes. Neither the age of respondents nor the use of assistive software was a significant predictor. From the information obtained in this study, guidelines were developed for a training programme designed to maximize the employability of people with physical disabilities.
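The logistic-regression analysis referred to above can be sketched as follows. The data are synthetic stand-ins for the survey variables (gender, education, computer skill, computer training, age, assistive software use) and the binary employment outcome; the variable codings and generated values are assumptions for illustration, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical stand-in for the survey: 82 respondents, binary employment outcome.
rng = np.random.default_rng(42)
n = 82
df = pd.DataFrame({
    "gender": rng.integers(0, 2, n),             # 0/1 coding assumed
    "education_years": rng.integers(8, 18, n),
    "computer_skill": rng.integers(1, 6, n),     # 1-5 self-rated skill level
    "computer_training": rng.integers(0, 2, n),  # received formal training?
    "age": rng.integers(16, 65, n),
    "assistive_software": rng.integers(0, 2, n),
})
# Synthetic outcome loosely tied to the predictors reported as significant.
logit = (-4 + 0.8 * df.gender + 0.2 * df.education_years
         + 0.5 * df.computer_skill + 0.9 * df.computer_training)
df["employed"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# Fit the logistic regression and inspect coefficients and p-values.
X = sm.add_constant(df.drop(columns="employed"))
model = sm.Logit(df["employed"].astype(int), X).fit(disp=False)
print(model.summary())  # odds ratios can be obtained via np.exp(model.params)
```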