948 results for soft computing methods
Abstract:
Managing financial institutions in an underdeveloped economic context has become a real challenge nowadays. In order to reach the organization's planned goals, these institutions have to deal with structural, behavioral and informational problems. From the systemic point of view, this situation gets even worse when the company does not present clear organizational boundaries and a cohesive identity for its stakeholders. Thus, European countries maintain special financial lines to help develop microcredit in Latin American communities, in an attempt to support the local economy. However, institutions like Caixa dos Andes in Peru present management problems when dealing with this complexity. Based on this, how can a systemic view help in the diagnosis of the soft problems of a Peruvian financial company? This study aims to diagnose the soft problems of a Peruvian financial company based on soft variables like identity, communication and autonomy, and also intends to identify possible ways to redesign its basic framework. The Viable System Model (VSM) method from Beer (1967), applied in this diagnostic study, was used in a practical way as a management tool for organizational analysis and planning. By describing the VSM's five systems, the creation of a systemic, total vision is possible, showing the organization's complexity from the inside. Several of the company's soft problems were identified, such as double control, inefficient use of physical and human resources, low information flows, and slowness. The VSM produced an organizational diagnosis indicating effective solutions that integrate its five systems.
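To make the five-system structure concrete, here is a minimal, purely illustrative Python sketch that maps the soft problems named in the abstract onto Beer's standard system numbering; the mapping itself is a hypothetical example, not taken from the study.

```python
# Minimal sketch of a VSM-style diagnostic checklist (illustrative only;
# the system roles follow Beer's standard numbering, while the mapping of
# findings to systems is a hypothetical placeholder, not study data).

VSM_SYSTEMS = {
    "S1": "Operations: the primary units that do the work",
    "S2": "Coordination: anti-oscillation between operational units",
    "S3": "Control: internal regulation, audit and resource bargaining",
    "S4": "Intelligence: outward- and forward-looking adaptation",
    "S5": "Policy: identity and ultimate authority",
}

def diagnose(findings: dict[str, list[str]]) -> None:
    """Print each VSM system together with any soft problems mapped onto it."""
    for code, role in VSM_SYSTEMS.items():
        issues = findings.get(code, [])
        status = "; ".join(issues) if issues else "no issue recorded"
        print(f"{code} - {role}\n    -> {status}")

# Hypothetical assignment of the soft problems named in the abstract:
diagnose({
    "S2": ["double control"],
    "S3": ["inefficient use of physical and human resources"],
    "S4": ["low information flows", "slowness"],
})
```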
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
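The core calculation, cost (A$) per DALY saved with a Monte Carlo uncertainty interval, can be sketched as follows; the distributions and parameter values are invented placeholders, not ACE-MH inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical per-patient inputs, each with its own uncertainty
# (NOT actual ACE-MH parameters):
cost = rng.normal(loc=2_500.0, scale=300.0, size=N)      # A$ per patient
dalys_saved = rng.gamma(shape=4.0, scale=0.05, size=N)   # DALYs per patient

icer = cost / dalys_saved  # A$ per DALY saved, one value per draw

point = np.median(icer)
lo, hi = np.percentile(icer, [2.5, 97.5])  # 95% uncertainty interval
print(f"Cost-effectiveness: A${point:,.0f} per DALY saved "
      f"(95% UI A${lo:,.0f} to A${hi:,.0f})")
```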
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for predicting MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been used successfully to predict T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as an experiment analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good-quality predictions. In this article, we present and discuss a framework for the modelling, testing, and application of computational methods used in the prediction of T-cell epitopes.
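Of the listed methods, a quantitative matrix is the easiest to illustrate: the predicted binding score of a peptide is the sum of per-position residue coefficients. Below is a toy Python sketch with invented coefficients; real matrices are trained on measured MHC-binding data and typically score 8-11-mer peptides.

```python
# Toy quantitative-matrix (PSSM-style) peptide scoring. The coefficients
# are invented for illustration only.

# MATRIX[i][aa] -> score contribution of amino acid `aa` at position i+1
MATRIX = [
    {"A": 0.2, "L": 1.1, "Y": 0.9},  # position 1
    {"A": 0.1, "L": 0.3, "Y": 0.2},  # position 2
    {"A": 0.0, "L": 1.4, "Y": 0.5},  # position 3
]

def score(peptide: str, default: float = -0.5) -> float:
    """Sum per-position contributions; residues absent from a column
    receive a small penalty."""
    return sum(col.get(aa, default) for col, aa in zip(MATRIX, peptide))

print(score("LAY"))  # 1.1 + 0.1 + 0.5 = 1.7
```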
Abstract:
We present a scheme that offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.
Abstract:
Analytical and bioanalytical high-performance liquid chromatography methods with fluorescence detection (HPLC-FLD) were developed and validated for the determination of chloroaluminum phthalocyanine (AlClPc) in different formulations of polymeric nanocapsules, and in plasma and livers of mice. Plasma and homogenized liver samples were extracted with ethyl acetate, and zinc phthalocyanine was used as the internal standard. The results indicated that the methods were linear and selective for all matrices studied. Analysis of accuracy and precision showed adequate values, with variations lower than 10% in biological samples and lower than 2% in analytical samples. The recoveries were as high as 96% and 99% in plasma and livers, respectively. The quantification limit of the analytical method was 1.12 ng/ml, and the limits of quantification of the bioanalytical method were 15 ng/ml and 75 ng/g for plasma and liver samples, respectively. The bioanalytical method was sensitive in the ranges of 15-100 ng/ml in plasma and 75-500 ng/g in liver samples, and was applied to studies of the biodistribution and pharmacokinetics of AlClPc.
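The abstract does not state how the quantification limits were derived; a common ICH-style estimate from a calibration curve is sketched below, with all numeric values as placeholders rather than figures from this study.

```python
# Common ICH-style estimates of detection/quantification limits from a
# calibration curve: LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma
# is the standard deviation of the response (e.g., of the residuals or
# of the intercept) and S is the calibration slope.

def lod_loq(sigma: float, slope: float) -> tuple[float, float]:
    return 3.3 * sigma / slope, 10.0 * sigma / slope

sigma = 0.8   # hypothetical SD of the response
slope = 5.3   # hypothetical slope (signal per ng/ml)
lod, loq = lod_loq(sigma, slope)
print(f"LOD = {lod:.2f} ng/ml, LOQ = {loq:.2f} ng/ml")
```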
Abstract:
New differential linear coherent scattering coefficient, μ_CS, data for four biological tissue types (pork fat, chicken tendon, and adipose and fibroglandular human breast tissue), covering a large momentum transfer interval (0.07 ≤ q ≤ 70.5 nm⁻¹) obtained by combining WAXS and SAXS data, are presented in order to emphasize the need to update the default database by including molecular interference and large-scale arrangement effects. The results showed that, in the low momentum transfer region, the differential linear coherent scattering coefficient is influenced by large-scale arrangements, mainly due to collagen fibrils in the chicken tendon and fibroglandular breast samples, and to triacylglycerides in the pork fat and adipose breast samples. At high momentum transfer, μ_CS reflects molecular interference effects related to water in the chicken tendon and fibroglandular samples, and to fatty acids in the pork fat and adipose samples.
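For readers unfamiliar with the q-scale, the momentum transfer follows from the scattering angle and wavelength; a small sketch is given below, where the Cu K-alpha wavelength is an assumption chosen for illustration, not taken from the paper.

```python
import math

def momentum_transfer(two_theta_deg: float, wavelength_nm: float) -> float:
    """q = (4*pi / lambda) * sin(theta), with 2*theta the scattering
    angle; result is in nm^-1 when the wavelength is given in nm."""
    theta = math.radians(two_theta_deg) / 2.0
    return 4.0 * math.pi * math.sin(theta) / wavelength_nm

# Hypothetical example with Cu K-alpha radiation (lambda ~= 0.1542 nm);
# small angles correspond to the SAXS regime, large angles to WAXS.
for angle in (1.0, 30.0, 120.0):
    q = momentum_transfer(angle, 0.1542)
    print(f"2theta = {angle:6.1f} deg -> q = {q:6.2f} nm^-1")
```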
Abstract:
UV-VIS spectrophotometric and spectrofluorimetric methods have been developed and validated for the quantification of chloroaluminum phthalocyanine (ClAlPc) in nanocarriers. In order to validate the methods, the linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy, and selectivity were examined according to USP 30 and ICH guidelines. Linearity ranges were 0.50-3.00 μg/mL (Y = 0.3829X [ClAlPc, μg/mL] + 0.0126; r = 0.9992) for spectrophotometry and 0.05-1.00 μg/mL (Y = 2.24 x 10^6 X [ClAlPc, μg/L] + 9.74 x 10^4; r = 0.9978) for spectrofluorimetry. In addition, ANOVA and lack-of-fit tests demonstrated that the regression equations were statistically significant (p < 0.05) and that the linear model is fully adequate for both analytical methods. The LOD values were 0.09 and 0.01 μg/mL, while the LOQ values were 0.27 and 0.04 μg/mL, for the spectrophotometric and spectrofluorimetric methods, respectively. Repeatability and intermediate precision for the proposed methods showed relative standard deviations (RSD) between 0.58% and 4.80%. The percent recovery ranged from 98.9% to 102.7% for spectrophotometric analyses and from 94.2% to 101.2% for spectrofluorimetry. No interference from common excipients was detected, and both methods were considered specific. Therefore, the methods are accurate, precise, specific, and reproducible, and hence can be applied to the quantification of ClAlPc in nanoemulsions (NE) and nanocapsules (NC).
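Applying the reported spectrophotometric calibration line in the usual way, a measured absorbance is converted back to concentration as sketched below; the absorbance reading is a hypothetical example.

```python
# Back-calculating concentration from the reported spectrophotometric
# calibration line Y = 0.3829*X + 0.0126, with X in ug/mL.

SLOPE, INTERCEPT = 0.3829, 0.0126

def concentration(absorbance: float) -> float:
    return (absorbance - INTERCEPT) / SLOPE

y = 0.80  # hypothetical absorbance within the validated range
x = concentration(y)
assert 0.50 <= x <= 3.00, "outside the validated linearity range"
print(f"A = {y} -> {x:.2f} ug/mL ClAlPc")  # -> 2.06 ug/mL
```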
Abstract:
Introduction: This study was designed to examine the effect of masticatory hypofunction and estrogen deficiency on mandibular bone mass and to compare this site with spine and femoral bone. Methods: Twenty-four rats were ovariectomized (OVX) or sham-operated (Sham) and analyzed after feeding with a hard diet (Hard) or a soft diet (Soft). They were divided into four groups: (GI) Sham-Hard; (GII) OVX-Hard; (GIII) Sham-Soft; and (GIV) OVX-Soft. Bone mineral density (BMD) was measured in the spine and femur at baseline and at the end of the study, and ΔBMD (final BMD - baseline BMD) was calculated. In mandibular bone, BMD and histomorphometry were analyzed at the end of the experiment. Results: Sham rats showed higher spine ΔBMD (GI: 13.5% vs GII: 0.74%, P < 0.01; GIII: 10.67% vs GIV: -4.36%, P < 0.001) and femur ΔBMD (GI: 14.43% vs GII: 4.42%, P < 0.01; GIII: 10.58% vs GIV: 0.49%, P < 0.001) than OVX rats, but no difference was observed in mandibular BMD among these groups (P > 0.05). Soft-diet groups showed decreased mandibular BMD compared with hard-diet groups (GIV vs GII, P < 0.01; GIII vs GI, P < 0.01). Similarly, mandibular condyle histomorphometry showed that the soft-diet groups presented a significant decrease in trabecular thickness and volume (GIV vs GII, P < 0.05; GIII vs GI, P < 0.01) compared with the hard-diet groups. Conclusion: Our results suggest that mandibular bone loss resulted from decreased mechanical loading during mastication and was not affected by estrogen depletion.
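The ΔBMD values above are reported as percentages, i.e. presumably the change relative to baseline; a minimal sketch with hypothetical readings:

```python
# Percent change in bone mineral density, as used for the spine and
# femur comparisons above: delta_BMD(%) = (final - baseline) / baseline * 100.
# The readings below are hypothetical, not data from the study.

def delta_bmd_percent(baseline: float, final: float) -> float:
    return (final - baseline) / baseline * 100.0

baseline, final = 0.205, 0.233  # hypothetical BMD in g/cm^2
print(f"Delta BMD = {delta_bmd_percent(baseline, final):.1f}%")  # 13.7%
```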
Abstract:
The main problem with current approaches to quantum computing is the difficulty of establishing and maintaining entanglement. A Topological Quantum Computer (TQC) aims to overcome this by using physical processes that are topological in nature and therefore less susceptible to disturbance by the environment. In a (2+1)-dimensional system, pseudoparticles called anyons have statistics that fall somewhere between those of bosons and fermions. The exchange of two anyons, an effect called braiding by analogy with knot theory, can occur in two different ways. The quantum states corresponding to the two elementary braids constitute a two-state system, allowing the definition of a computational basis. Quantum gates can be built up from patterns of braids, and for quantum computing it is essential that the operator describing the braiding, the R-matrix, be unitary. The physics of anyonic systems is governed by quantum groups, in particular the quasi-triangular Hopf algebras obtained from finite groups by applying the Drinfeld quantum double construction. Their representation theory has been described in detail by Gould and Tsohantjis, and in this review article we relate the work of Gould to TQC schemes, particularly that of Kauffman.
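As a concrete instance of the unitarity requirement on the R-matrix, the sketch below checks the braid generators of the well-known Fibonacci anyon model; this particular model is chosen here purely as an illustration and is not one discussed in the abstract.

```python
import numpy as np

# Braid generators for three Fibonacci anyons, using the standard
# R-matrix (braiding phases of the two fusion channels) and F-matrix
# (fusion-basis change); phase conventions vary across references.

phi = (1 + np.sqrt(5)) / 2  # golden ratio

R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])

sigma1 = R           # braid the first pair of anyons
sigma2 = F @ R @ F   # braid the second pair (F is its own inverse)

# Both generators must be unitary for valid quantum gates.
for name, m in (("sigma1", sigma1), ("sigma2", sigma2)):
    print(f"{name} unitary:", np.allclose(m.conj().T @ m, np.eye(2)))

# Braid (Yang-Baxter) relation: s1 s2 s1 == s2 s1 s2
print("braid relation:",
      np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2))
```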