994 results for simple algorithms
Abstract:
A robust semi-implicit central partial difference algorithm for the numerical solution of coupled stochastic parabolic partial differential equations (PDEs) is described. This can be used for calculating correlation functions of systems of interacting stochastic fields. Such field equations can arise in the description of Hamiltonian and open systems in the physics of nonlinear processes, and may include multiplicative noise sources. The algorithm can be used for studying the properties of nonlinear quantum or classical field theories. The general approach is outlined and applied to a specific example, namely the quantum statistical fluctuations of ultra-short optical pulses in chi(2) parametric waveguides. This example uses a non-diagonal coherent state representation, and correctly predicts the sub-shot-noise spectral fluctuations observed in homodyne detection measurements. It is expected that the methods used will be applicable to higher-order correlation functions and other physical problems as well. A stochastic differencing technique for reducing sampling errors is also introduced. This involves solving nonlinear stochastic parabolic PDEs in combination with a reference process, which uses the Wigner representation in the example presented here. A computer implementation on MIMD parallel architectures is discussed. (C) 1997 Academic Press.
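The core numerical idea, treating the diffusion term implicitly with central differences while keeping the multiplicative noise explicit, can be sketched for a single 1D stochastic heat equation, du = D u_xx dt + sqrt(eps) u dW. Everything below (the equation, grid, and boundary conditions) is an illustrative toy, not the authors' algorithm:

```python
import math
import random

def thomas_solve(sub, diag, sup, rhs):
    """Solve a tridiagonal linear system by the Thomas algorithm."""
    n = len(rhs)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def semi_implicit_step(u, dt, dx, D, eps, rng):
    """One step of du = D u_xx dt + sqrt(eps) u dW with zero (Dirichlet)
    boundaries: diffusion is treated implicitly via central differences,
    the multiplicative noise explicitly (Euler-Maruyama)."""
    n = len(u)
    r = D * dt / dx ** 2
    sub = [-r] * n
    diag = [1.0 + 2.0 * r] * n
    sup = [-r] * n
    # explicit noise contribution, evaluated at the old field
    rhs = [ui + math.sqrt(eps * dt) * ui * rng.gauss(0.0, 1.0) for ui in u]
    # pin the boundary points to zero
    sub[0] = sup[0] = sub[-1] = sup[-1] = 0.0
    diag[0] = diag[-1] = 1.0
    rhs[0] = rhs[-1] = 0.0
    return thomas_solve(sub, diag, sup, rhs)
```

With eps = 0 the step reduces to a plain implicit heat-equation step, which is a convenient sanity check before switching the noise on.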
Abstract:
Objective: In this study we assessed how often patients manifesting a myocardial infarction (MI) would not be considered candidates for intensive lipid-lowering therapy based on the current guidelines. Methods: In 355 consecutive patients manifesting ST-elevation MI (STEMI), admission plasma C-reactive protein (CRP) was measured and the Framingham risk score (FRS), PROCAM risk score, Reynolds risk score, ASSIGN risk score, QRISK, and SCORE algorithms were applied. Cardiac computed tomography and carotid ultrasound were performed to assess the coronary artery calcium score (CAC), carotid intima-media thickness (cIMT), and the presence of carotid plaques. Results: Less than 50% of STEMI patients would have been identified as high risk before the event by any of these algorithms. With the exception of FRS (9%), all other algorithms would assign low risk to about half of the enrolled patients. Plasma CRP was <1.0 mg/L in 70% and >2 mg/L in 14% of the patients. The average cIMT was 0.8 +/- 0.2 mm, and it was >= 1.0 mm in only 24% of patients. Carotid plaques were found in 74% of patients. CAC > 100 was found in 66% of patients. Adding CAC > 100 and the presence of carotid plaque as criteria, a high-risk condition would be identified in 100% of the patients using any of the above-mentioned algorithms. Conclusion: More than half of patients manifesting STEMI would not be considered candidates for intensive preventive therapy by the current clinical algorithms. The addition of anatomical parameters such as CAC and the presence of carotid plaques can substantially reduce this underestimation of CVD risk. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
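The conclusion amounts to a simple disjunctive decision rule: flag a patient as high risk if any clinical score does, or if CAC > 100, or if a carotid plaque is present. A minimal sketch with hypothetical patients:

```python
def high_risk(score_flags_high, cac, has_plaque, cac_threshold=100):
    """Flag a patient as high risk if any clinical algorithm does,
    or the coronary calcium score exceeds the threshold, or a
    carotid plaque is present (illustrative rule, not a guideline)."""
    return score_flags_high or cac > cac_threshold or has_plaque

# Hypothetical patients: (clinical score says high risk, CAC, plaque)
patients = [
    (False, 150, False),  # missed by the scores, caught by CAC
    (False, 20, True),    # caught by the carotid plaque
    (True, 0, False),     # caught by a clinical score
    (False, 50, False),   # missed by every criterion
]
flagged = sum(high_risk(s, c, p) for s, c, p in patients)
```

The point of the abstract is precisely that the first two rows, invisible to the clinical scores alone, are recovered by the anatomical criteria.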
Abstract:
A risk score model was developed based on a population of 1,224 individuals aged 35 years or more, without known diabetes, drawn from an urban Brazilian population sample, in order to select individuals who should be screened by subsequent testing and to improve the efficacy of public health screening. External validation was performed in a second, independent population from a different city, ascertained through a similar epidemiological protocol. The risk score was developed by multiple logistic regression, and model performance and cutoff values were derived from a receiver operating characteristic (ROC) curve. The model's capacity to predict fasting blood glucose levels was tested by analyzing data from a 5-year follow-up protocol conducted in the general population. Items independently and significantly associated with diabetes were age, BMI, and known hypertension. Sensitivity, specificity, and the proportion of individuals requiring further testing at the best cutoff value were 75.9, 66.9, and 37.2%, respectively. External validation confirmed the model's adequacy (AUC equal to 0.72). Finally, the model score was also capable of predicting fasting blood glucose progression in non-diabetic individuals over a 5-year follow-up period. In conclusion, this simple diabetes risk score was able to identify individuals with an increased likelihood of having diabetes, and it can be used to stratify subpopulations in which performing subsequent tests is necessary and probably cost-effective.
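A score of this shape combines age, BMI and known hypertension through a logistic model and refers everyone above a cutoff for blood testing. A minimal sketch with made-up coefficients (the published model's weights and cutoff are not given in the abstract):

```python
import math

def diabetes_risk(age, bmi, hypertension,
                  b0=-7.0, b_age=0.05, b_bmi=0.12, b_ht=0.8):
    """Logistic risk score from age, BMI and known hypertension.
    The coefficients here are illustrative placeholders, not the
    published model's."""
    z = b0 + b_age * age + b_bmi * bmi + b_ht * (1 if hypertension else 0)
    return 1.0 / (1.0 + math.exp(-z))

def select_for_testing(patients, cutoff):
    """Refer for blood testing the patients whose score meets the
    screening cutoff; patients are (age, bmi, hypertension) tuples."""
    return [p for p in patients if diabetes_risk(*p) >= cutoff]
```

Moving the cutoff trades sensitivity against the proportion of the population sent for further testing, which is exactly the trade-off the abstract reads off the ROC curve.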
Abstract:
Microsatellites or simple sequence repeats (SSRs) are ubiquitous in eukaryotic genomes. Single-locus SSR markers have been developed for a number of species, although there is a major bottleneck in developing SSR markers: flanking sequences must be known in order to design 5'-anchored polymerase chain reaction (PCR) primers. Inter-SSR (ISSR) fingerprinting was developed so that no sequence knowledge is required. Primers based on a repeat unit, such as (CA)n, can be made with a degenerate 3'-anchor, such as (CA)8RG or (AGC)6TY. The resulting PCR amplifies the sequence between two SSRs, yielding a multilocus marker system useful for fingerprinting, diversity analysis and genome mapping. PCR products are radiolabelled with P-32 or P-33 via end-labelling or PCR incorporation, and separated on a polyacrylamide sequencing gel prior to autoradiographic visualisation. A typical reaction yields 20-100 bands per lane, depending on the species and primer. We have used ISSR fingerprinting in a number of plant species, and report here some results on two important tropical species, sorghum and banana. Previous investigators have demonstrated that ISSR analysis usually detects a higher level of polymorphism than that detected with restriction fragment length polymorphism (RFLP) or random amplified polymorphic DNA (RAPD) analyses. Our data indicate that this is not the result of greater genetic polymorphism, but rather of technical factors related to the detection methodology used for ISSR analysis.
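The anchored-primer idea can be illustrated with a toy single-strand model: expand a degenerate primer such as (CA)8RG into a regular expression (IUPAC codes R = A/G, Y = C/T) and report the stretches lying between successive binding sites. Real PCR requires a binding site on each strand; this sketch deliberately ignores that.

```python
import re

# IUPAC degenerate bases used in anchored ISSR primers
IUPAC = {"R": "[AG]", "Y": "[CT]"}

def primer_to_regex(primer):
    """Expand an anchored primer such as '(CA)8RG' into a regex:
    eight CA repeats followed by A-or-G, then G."""
    m = re.match(r"\((\w+)\)(\d+)(\w*)$", primer)
    unit, count, anchor = m.group(1), int(m.group(2)), m.group(3)
    tail = "".join(IUPAC.get(base, base) for base in anchor)
    return unit * count + tail

def issr_fragments(seq, primer):
    """Toy single-strand model of ISSR amplification: the 'amplicons'
    are the stretches between successive primer-binding sites."""
    sites = list(re.finditer(primer_to_regex(primer), seq))
    return [seq[sites[i].end():sites[i + 1].start()]
            for i in range(len(sites) - 1)]
```

On a synthetic sequence with two anchored (CA)8 sites flanking a TTTT insert, the model recovers the inter-SSR fragment, mirroring how each band on the gel corresponds to one such stretch.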
Abstract:
The concept of parameter-space size adjustment is proposed in order to enable the successful application of genetic algorithms to continuous optimization problems. The performance of genetic algorithms with six different combinations of selection and reproduction mechanisms, with and without parameter-space size adjustment, was rigorously tested on eleven multiminima test functions. The best-performing algorithm was then employed for the determination of the model parameters of the optical constants of Pt, Ni and Cr.
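A toy version of the idea on a one-dimensional multiminima function (1D Rastrigin): a real-coded GA whose search interval is contracted around the incumbent best after each generation. This is our illustrative reading of "parameter-space size adjustment", not the paper's exact mechanism, and the operators and constants are arbitrary choices.

```python
import math
import random

def rastrigin_1d(x):
    """A classic multiminima test function; global minimum at x = 0."""
    return x * x + 10.0 - 10.0 * math.cos(2.0 * math.pi * x)

def ga_minimize(f, lo, hi, pop_size=30, generations=60,
                shrink=0.92, seed=1):
    """Real-coded GA: tournament selection, blend crossover, Gaussian
    mutation. After every generation the width of the search window is
    multiplied by `shrink` and re-centred on the best point found so
    far -- the parameter-space size adjustment, as we read it."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(pop_size)]
    best = min(xs, key=f)
    width = hi - lo
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            a = min(rng.sample(xs, 3), key=f)  # tournament of 3
            b = min(rng.sample(xs, 3), key=f)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.1 * width)
            # keep the child inside the current (shrinking) window
            child = min(max(child, best - width / 2), best + width / 2)
            children.append(child)
        xs = children
        candidate = min(xs, key=f)
        if f(candidate) < f(best):
            best = candidate
        width *= shrink  # adjust the parameter-space size
    return best
```

Without the shrinking window, the mutation scale stays tied to the full interval and the population keeps jumping between basins; shrinking it turns late generations into a local refinement around the best basin found.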
Abstract:
Background: Cannabis is the most used illicit drug in the world, and its use has been associated with prefrontal cortex (PFC) dysfunction, including deficits in executive functions (EF). Considering that EF may influence treatment outcome, it would be useful to have a brief neuropsychological battery to assess EF in chronic cannabis users (CCU). In the present study, the Frontal Assessment Battery (FAB), a brief, easy-to-use neuropsychological instrument designed to evaluate EF, was used to evaluate the cognitive functioning of CCU. Methods: We evaluated 107 abstinent CCU with the FAB and compared them with 44 controls matched for age, estimated IQ, and years of education. Results: CCU performed poorly compared to controls (FAB total score = 16.53 vs. 17.09, p < .05). CCU also performed poorly on the Motor Programming subtest (2.47 vs. 2.73, p < .05). Conclusion: This study examined the effects of cannabis on executive functioning and provided evidence that the FAB is sensitive enough to detect EF deficits in early-abstinence chronic cannabis users. The clinical significance of these findings remains to be investigated in further longitudinal studies. The FAB may be useful as a screening instrument to evaluate the need for a complete neuropsychological assessment in this population.
Abstract:
We suggest a new notion of behaviour-preserving refinement based on partial-order semantics, called transition refinement. We introduced transition refinement for elementary (low-level) Petri nets earlier. For modelling and verifying complex distributed algorithms, high-level (algebraic) Petri nets are usually used. In this paper, we define transition refinement for algebraic Petri nets. This notion is more powerful than transition refinement for elementary Petri nets because it corresponds to the simultaneous refinement of several transitions in an elementary Petri net. Transition refinement is particularly suitable for refinement steps that increase the degree of distribution of an algorithm, e.g. when synchronous communication is replaced by asynchronous message passing. We study how to prove that a replacement of a transition is a transition refinement.
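The kind of refinement step mentioned at the end, replacing synchronous communication by asynchronous message passing, can be pictured on an elementary net: one transition is split into a "send" and a "receive" connected through a fresh buffer place. A minimal sketch (the class and the refinement rule are illustrative, not the paper's formal definition, and no behaviour-preservation check is performed):

```python
class PetriNet:
    """Elementary Petri net: places carry token counts, transitions
    have lists of pre- and post-places."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> number of tokens
        self.transitions = {}          # name -> (pre, post)

    def add_transition(self, name, pre, post):
        self.transitions[name] = (list(pre), list(post))

    def enabled(self, name):
        pre, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in pre)

    def fire(self, name):
        pre, post = self.transitions[name]
        assert self.enabled(name), name
        for p in pre:
            self.marking[p] -= 1
        for p in post:
            self.marking[p] = self.marking.get(p, 0) + 1

def refine(net, name, buffer_place):
    """Replace transition `name` by a send/receive pair communicating
    through a new buffer place -- a toy version of refining synchronous
    communication into asynchronous message passing."""
    pre, post = net.transitions.pop(name)
    net.add_transition(name + "_send", pre, [buffer_place])
    net.add_transition(name + "_recv", [buffer_place], post)
```

Firing the send/receive pair reproduces the effect of firing the original transition, which is the intuition behind requiring the replacement to be a (behaviour-preserving) transition refinement.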
Abstract:
Experience with laparoscopic liver resections has increased in recent years, and so has the number of patients operated on by minimally invasive techniques. Specimen extraction is an important step of laparoscopic liver resection, and the size of the specimen is usually a limitation on the use of laparoscopy. The aim of this paper is to describe a new technique combining a Pfannenstiel suprapubic incision and obstetric forceps to remove large specimens from laparoscopic liver resections. The present technique allows the expeditious extraction of intact specimens, even very large ones, through a standard suprapubic Pfannenstiel incision. This technique has additional functional and cosmetic advantages over other techniques of specimen retrieval. We believe that the described technique is feasible, can be performed easily and rapidly, facilitates laparoscopic liver resection by reducing the technical difficulty of specimen removal, and may also be used in other abdominal laparoscopic interventions that deal with large surgical specimens.
Abstract:
Aim: To demonstrate that the evaluation of erythrocyte dysmorphism by light microscopy with lowering of the condenser lens (LMLC) is useful for identifying patients with haematuria of glomerular or non-glomerular origin. Methods: A comparative double-blind study between phase contrast microscopy (PCM) and LMLC is reported to evaluate the efficacy of these techniques. Urine samples of 39 patients followed up for 9 months were analyzed and classified as glomerular or non-glomerular haematuria. The microscopic techniques were compared using receiver operating characteristic (ROC) curve analysis and the area under the curve (AUC). Reproducibility was assessed by the coefficient of variation (CV). Results: Specific cut-offs were set for each method according to their best rates of specificity and sensitivity: 30% for phase contrast microscopy (95% sensitivity, 100% specificity) and 40% for standard LMLC (90% sensitivity, 100% specificity). In the ROC analysis, the AUC for PCM was 0.99 and the AUC for LMLC was 0.96. The CV was very similar in the glomerular haematuria group for PCM (35%) and LMLC (35.3%). Conclusion: LMLC proved effective in directing the investigation of haematuria toward a nephrological or urological cause. This method can substitute for PCM when that equipment is not available.
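The cut-off selection described in the Methods can be sketched as picking, from the observed dysmorphic-erythrocyte percentages, the threshold that maximizes sensitivity + specificity (the Youden index one reads off a ROC curve). The data below are hypothetical:

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity and specificity of the rule 'glomerular origin if the
    percentage of dysmorphic erythrocytes >= cutoff'.
    labels: True = glomerular haematuria. Assumes both classes occur."""
    tp = sum(v >= cutoff and y for v, y in zip(values, labels))
    fn = sum(v < cutoff and y for v, y in zip(values, labels))
    tn = sum(v < cutoff and not y for v, y in zip(values, labels))
    fp = sum(v >= cutoff and not y for v, y in zip(values, labels))
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(values, labels):
    """Pick the observed value maximizing sensitivity + specificity
    (the Youden index), as one would read it off a ROC curve."""
    return max(sorted(set(values)),
               key=lambda c: sum(sens_spec(values, labels, c)))
```

Running the same selection on each method's measurements is what yields the different cut-offs (30% for PCM, 40% for LMLC) reported above.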
Abstract:
Described in this article is a novel device that facilitates study of the cross-sectional anatomy of the human head. In designing our device, we aimed to protect sections of the head from the destructive effects of handling during the anatomy laboratory while also ensuring excellent visualization of the anatomic structures. We used an electric saw to create 15-mm sections of three cadaver heads in the three traditional anatomic planes and inserted each section into a thin, perforated display box made of transparent acrylic material. The thin display boxes with head sections are kept in anatomical order in a larger transparent acrylic storage box containing formaldehyde solution, which preserves the specimens while also permitting direct observation of the structures and their anatomic relationships to each other. This box-within-a-box design allows students to easily view sections of a head in their anatomical position as well as to examine internal structures by manipulating individual display boxes without altering the integrity of the preparations. This methodology for demonstrating cross-sectional anatomy allows efficient use of cadaveric material and technician time while also giving learners the best possible handling and visualization of complex anatomic structures. Our approach to teaching cross-sectional anatomy of the head can be applied to any part of the human body, and the value of our device design will only increase as advances in and the proliferation of imaging technology demand a more sophisticated understanding of cross-sectional anatomy. Anat Sci Educ 3: 141-143, 2010. (C) 2010 American Association of Anatomists.