937 results for "Editor of flow analysis methods"


Relevance: 100.00%

Abstract:

OBJECTIVE: To test discriminant analysis as a method of turning the information of a routine customer satisfaction survey (CSS) into a more accurate decision-making tool. METHODS: A 7-question, 10-multiple-choice, self-administered questionnaire was used to study a sample of patients seen in two outpatient care units in Valparaíso, Chile, one of primary care (n=100) and the other of secondary care (n=249). Two cut-off points were considered in the dependent variable (final satisfaction score): satisfied versus unsatisfied, and very satisfied versus all others. Results were compared with empirical measures (proportion of satisfied individuals, proportion of unsatisfied individuals, and size of the median). RESULTS: The response rate was very high, over 97.0% in both units. A new variable, medical attention, emerged as an explanatory factor for satisfaction at the primary care unit. The proportion of the total variability explained by the model was very high (over 99.4%) in both units when comparing satisfied with unsatisfied customers. In the analysis of very satisfied versus all other customers, a significant relationship was identified only in the case of the primary care unit, and it explained a small proportion of the variability (41.9%). CONCLUSIONS: Discriminant analysis identified relationships not revealed by the previous analysis. It provided information about the proportion of the variability explained by the model. It identified non-significant relationships suggested by empirical analysis (e.g., the relation of very satisfied versus others in the secondary care unit). It measured the contribution of each independent variable to the explanation of the variation of the dependent one.
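The discriminant approach the abstract describes can be sketched with scikit-learn; the data below are synthetic and the two predictor "items" are invented stand-ins for the survey variables, not the study's actual questionnaire.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical predictors: two survey-item scores. Satisfied respondents
# (class 1) score higher on average than unsatisfied ones (class 0).
X_unsat = rng.normal(loc=4.0, scale=1.0, size=(100, 2))
X_sat = rng.normal(loc=7.0, scale=1.0, size=(100, 2))
X = np.vstack([X_unsat, X_sat])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# The coefficients indicate each item's contribution to the discriminant
# function, analogous to the study's per-variable contribution analysis.
print("coefficients:", lda.coef_)
print("training accuracy:", lda.score(X, y))
```

With well-separated groups the model classifies almost perfectly, which is the situation the abstract reports for the satisfied-versus-unsatisfied cut-off.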

Relevance: 100.00%

Abstract:

Dissertation presented to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Environmental Engineering

Relevance: 100.00%

Abstract:

Dissertation submitted for the degree of Master in Informatics Engineering (Engenharia Informática)

Relevance: 100.00%

Abstract:

Objective: The epilepsies associated with the tuberous sclerosis complex (TSC) are very often refractory to medical therapy. Epilepsy surgery is an effective alternative when the critical link between the localization of seizure onset on the scalp and a particular cortical tuber can be established. In this study we analysed ictal and interictal EEG to strengthen this link. Methods: The ictal and interictal recordings of four patients with TSC undergoing epilepsy surgery were submitted to independent component analysis (ICA), followed by source analysis using the sLORETA algorithm. The localizations obtained for the ictal EEG and for the average interictal spikes were compared. Results: The ICA of ictal EEG produced consistent results across different events, and there was good agreement with the tubers that were successfully removed in three of the four patients (one patient refused surgery). In some patients there was a large discrepancy between the localization of ictal and interictal sources, and the interictal activity produced more widespread source localizations. Conclusions: The use of ICA of ictal EEG, followed by source analysis methods, in four cases of epilepsy and TSC localized the epileptic generators very near the lesions successfully removed in epilepsy surgery. Significance: The ICA of ictal EEG events may be a useful add-on to the tools used to establish the connection between epileptic scalp activity and the cortical tubers originating it in patients with TSC considered for epilepsy surgery.
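The ICA unmixing step can be illustrated with scikit-learn's FastICA on a synthetic mixture; the sources, channel count and mixing matrix below are invented, and the subsequent sLORETA localization stage is not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)

# Two hypothetical sources: a rhythmic spike-like discharge (a crude stand-in
# for ictal activity) and Gaussian background activity.
s1 = np.sign(np.sin(3 * 2 * np.pi * t))
s2 = rng.normal(size=t.size)
S = np.c_[s1, s2]

# Mix the sources into four simulated scalp channels.
A = rng.normal(size=(2, 4))
X = S @ A

# Unmix with ICA; each recovered component could then be passed to a source
# localization method (sLORETA in the study) via its mixing-matrix column.
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)
print(components.shape)  # (2000, 2)
```

On a clean mixture like this, one recovered component closely matches the rhythmic source up to sign and scale, which is what makes the component usable for localization.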

Relevance: 100.00%

Abstract:

Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
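The static type system itself is beyond a short example, but the core idea, a security level that depends on a runtime field value, can be illustrated with a toy dynamic check; this is a rough analogue for intuition only, not the thesis's typechecker, and all names here are invented.

```python
# Toy dynamic analogue of a dependent security level: the label guarding a
# record's "content" field depends on the runtime value of its "owner" field.

def security_level(record):
    # The level of `content` is not fixed at compile time; it depends on
    # runtime data (the owner), which is the key feature of dependent
    # information flow types.
    return ("user", record["owner"])

def read_content(record, reader):
    level = security_level(record)
    # A reader may observe `content` only if their clearance matches the
    # dependent level (here: being the owner, or an "admin" who dominates it).
    if reader == record["owner"] or reader == "admin":
        return record["content"]
    raise PermissionError(f"flow from {level} to {reader!r} not allowed")

msg = {"owner": "alice", "content": "secret"}
print(read_content(msg, "alice"))  # permitted: reader matches the owner
```

The thesis's contribution is checking such policies statically, during development, rather than raising errors at runtime as this sketch does.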

Relevance: 100.00%

Abstract:

This paper continues a discussion of approaches and methodologies we have used in our studies of feeding in haematophagous insects. Described are techniques for directly monitoring behaviour: electrical recording of feeding behaviour via resistance changes in the food canal, optical methods for monitoring mouthpart activity, and a computer technique for behavioural event recording. Also described is the use of "flow charts" or "decision diagrams" to model interrelated sequences of behaviours.

Relevance: 100.00%

Abstract:

In this report we present a concise review of the use of flow cytometric methods to characterize and differentiate between two different mechanisms of cell death, apoptosis and necrosis. The applications of these techniques to clinical and basic research are also considered. The following cell features are useful to characterize the mode of cell death: (1) activation of an endonuclease in apoptotic cells results in extraction of the low molecular weight DNA following cell permeabilization, which, in turn, leads to their decreased stainability with DNA-specific fluorochromes; measurements of DNA content make it possible to identify apoptotic cells and to recognize the cell cycle phase specificity of the apoptotic process; (2) plasma membrane integrity, which is lost in necrotic but not in apoptotic cells; (3) a decrease in forward light scatter, paralleled either by no change or by an increase in side scatter, which represents an early change during apoptosis. The data presented indicate that flow cytometry can be applied to basic research on the molecular and biochemical mechanisms of apoptosis, as well as in clinical situations, where the ability to monitor early signs of apoptosis in some systems may be predictive of the outcome of some treatment protocols.
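The three criteria listed above amount to a gating rule, which can be sketched as a toy classifier; the thresholds and feature scales below are invented for illustration, not calibrated cytometry values.

```python
# Toy gating rule based on the three criteria in the review; all thresholds
# are illustrative, not real instrument settings.

def classify_cell(dna_stain, membrane_intact, forward_scatter):
    """dna_stain and forward_scatter are normalized to ~1.0 for viable cells."""
    if not membrane_intact:
        # Criterion (2): necrotic cells lose plasma-membrane integrity.
        return "necrotic"
    if dna_stain < 0.5 and forward_scatter < 0.7:
        # Criteria (1) and (3): reduced DNA stainability (sub-G1) plus
        # decreased forward scatter mark apoptotic cells.
        return "apoptotic"
    return "viable"

print(classify_cell(0.3, True, 0.5))  # apoptotic
```

Real analyses gate on bivariate distributions rather than fixed cut-offs, but the decision logic is the one the review enumerates.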

Relevance: 100.00%

Abstract:

A total of 106 women with vaginitis in Nicaragua were studied. The positive rate for the identification of Candida species was 41% (44 positive cultures out of 106 women with vaginitis). Using culture of vaginal fluid as the gold standard for the diagnosis of candidiasis, the sensitivity of microscopic examination of a wet mount with potassium hydroxide (KOH) was 61%, and that of Gram's stain was 70%. Among the 44 positive cultures, the yeast species isolated from vaginal swabs were C. albicans (59%), C. tropicalis (23%), C. glabrata (14%) and C. krusei (4%). This study reports the first characterization of 26 C. albicans stocks from Nicaragua by the random amplified polymorphic DNA method. The genetic analysis of this small C. albicans population showed the existence of linkage disequilibrium, which is consistent with the hypothesis that C. albicans undergoes clonal propagation.

Relevance: 100.00%

Abstract:

The present study analysed the concordance among four different molecular diagnostic methods for tuberculosis (TB) in pulmonary and blood samples from immunocompromised patients. A total of 165 blood and 194 sputum samples were collected from 181 human immunodeficiency virus (HIV)-infected patients with upper respiratory complaints, regardless of suspicion of TB. The samples were submitted to smear microscopy, culture and the molecular tests: a laboratory-developed conventional polymerase chain reaction (PCR), real-time quantitative PCR (qPCR), and the Gen-Probe and Detect-TB Ampligenix kits. The samples were handled blindly by all the technicians involved, from sample processing to results analysis. For sputum, the sensitivity and specificity were 100% and 96.7% for qPCR, 81.8% and 94.5% for Gen-Probe, and 100% and 66.3% for Detect-TB, respectively. qPCR presented the best concordance with sputum culture [kappa (k) = 0.864], followed by Gen-Probe (k = 0.682). For blood samples, qPCR showed 100% sensitivity and 92.3% specificity, with a substantial correlation with sputum culture (k = 0.754) and with the qPCR results obtained from the sputum of the corresponding patient (k = 0.630). Conventional PCR demonstrated the worst results for sputum and blood, with sensitivities of 100% vs. 88.9% and specificities of 46.3% vs. 32%, respectively. Commercial or laboratory-developed molecular assays can overcome the difficulties in the diagnosis of TB in paucibacillary patients using conventional methods available in most laboratories.
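The concordance figures above are Cohen's kappa values; a minimal computation from a 2x2 agreement table looks like this (the counts below are illustrative, not the study's data).

```python
# Cohen's kappa from a 2x2 agreement table between a molecular test and
# culture; the cell counts are invented for illustration.

def cohens_kappa(a, b, c, d):
    """a = both positive, b = test+/culture-, c = test-/culture+, d = both negative."""
    n = a + b + c + d
    observed = (a + d) / n
    # Expected chance agreement, computed from the marginal totals.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (observed - expected) / (1 - expected)

# 40 concordant positives, 55 concordant negatives, 5 discordant results:
print(round(cohens_kappa(40, 2, 3, 55), 3))  # 0.898
```

A kappa near 0.86, as reported for qPCR versus culture, likewise sits in the "almost perfect" band of the usual Landis-Koch scale.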

Relevance: 100.00%

Abstract:

We evaluated 25 protocol variants of 14 independent computational methods for exon identification, transcript reconstruction and expression-level quantification from RNA-seq data. Our results show that most algorithms are able to identify discrete transcript components with high success rates but that assembly of complete isoform structures poses a major challenge even when all constituent elements are identified. Expression-level estimates also varied widely across methods, even when based on similar transcript models. Consequently, the complexity of higher eukaryotic genomes imposes severe limitations on transcript recall and splice product discrimination that are likely to remain limiting factors for the analysis of current-generation RNA-seq data.

Relevance: 100.00%

Abstract:

The application of correspondence analysis to square asymmetric tables is often unsuccessful because of the strong role played by the diagonal entries of the matrix, obscuring the data off the diagonal. A simple modification of the centering of the matrix, coupled with the corresponding change in row and column masses and row and column metrics, allows the table to be decomposed into symmetric and skew-symmetric components, which can then be analyzed separately. The symmetric and skew-symmetric analyses can be performed using a simple correspondence analysis program if the data are set up in a special block format.
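The decomposition itself is elementary and can be sketched in a few lines; the table below is an arbitrary example, and the correspondence-analysis machinery (masses, metrics, block format) is not reproduced here.

```python
import numpy as np

# An arbitrary square asymmetric table, e.g. flows between three categories.
N = np.array([[10.0, 3.0, 1.0],
              [6.0, 8.0, 2.0],
              [4.0, 5.0, 9.0]])

# Split into symmetric and skew-symmetric parts: N = S + K.
S = (N + N.T) / 2  # symmetric component: average two-way association
K = (N - N.T) / 2  # skew-symmetric component: net directional flow

assert np.allclose(S + K, N)
assert np.allclose(S, S.T)
assert np.allclose(K, -K.T)

# The diagonal, which dominates the plain analysis, lies entirely in S;
# K has a zero diagonal by construction.
print(K)
```

Analyzing S and K separately is what keeps the dominant diagonal from obscuring the off-diagonal (directional) structure.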

Relevance: 100.00%

Abstract:

Practical guidelines for monitoring and measuring compounds such as jasmonates, ketols, ketodi(tri)enes and hydroxy-fatty acids as well as detecting the presence of novel oxylipins are presented. Additionally, a protocol for the penetrant analysis of non-enzymatic lipid oxidation is described. Each of the methods, which employ gas chromatography/mass spectrometry, can be applied without specialist knowledge or recourse to the latest analytical instrumentation. Additional information on oxylipin quantification and novel protocols for preparing oxygen isotope-labelled internal standards are provided. Four developing areas of research are identified: (i) profiling of the unbound cellular pools of oxylipins; (ii) profiling of esterified oxylipins and/or monitoring of their release from parent lipids; (iii) monitoring of non-enzymatic lipid oxidation; (iv) analysis of unstable and reactive oxylipins. The methods and protocols presented herein are designed to give technical insights into the first three areas and to provide a platform from which to enter the fourth area.

Relevance: 100.00%

Abstract:

BACKGROUND: Protein-energy malnutrition is highly prevalent in aged populations, and the associated clinical, economic, and social burden is important. A valid screening method that would be robust and precise, but also easy, simple, and rapid to apply, is essential for adequate therapeutic management. OBJECTIVES: To compare the interobserver variability of 2 methods measuring food intake: semiquantitative visual estimations made by nurses versus calorie measurements performed by dieticians on the basis of standardized color digital photographs of servings before and after consumption. DESIGN: Observational monocentric pilot study. SETTING/PARTICIPANTS: A geriatric ward. The meals were randomly chosen from the meal tray; the choice was anonymous with respect to the patients who consumed them. MEASUREMENTS: The test method consisted of the estimation of calorie consumption by dieticians on the basis of standardized color digital photographs of servings before and after consumption. The reference method was based on direct visual estimations of the meals by nurses. Food intake was expressed in the form of a percentage of the serving consumed, and calorie intake was then calculated by a dietician based on these percentages. The methods were applied with no previous training of the observers. Analysis of variance was performed to compare their interobserver variability. RESULTS: Of 15 meals consumed and initially examined, 6 were assessed with each method. Servings not consumed at all (0% consumption) or entirely consumed by the patient (100% consumption) were not included in the analysis so as to avoid systematic error. The digital photography method showed higher interobserver variability in calorie intake estimations, and the difference between the compared methods was statistically significant (P < .03). CONCLUSIONS: Calorie intake measures for geriatric patients are more concordant when estimated in a semiquantitative way. Digital photography for food intake estimation without previous specific training of dieticians should not be considered as a reference method in geriatric settings, as it shows no advantages in terms of interobserver variability.

Relevance: 100.00%

Abstract:

The soil water available to crops is defined by specific values of water potential limits. Underlying the estimation of these hydro-physical limits, identified as the permanent wilting point (PWP) and field capacity (FC), is the selection of a suitable method based on a multi-criteria analysis that is not always clearly defined. In this kind of analysis, the time required for measurements must be taken into consideration, as well as other external measurement factors, e.g., the reliability and suitability of the study area, measurement uncertainty, cost, effort and labour invested. In this paper, the efficiency of different methods for determining hydro-physical limits is evaluated by using indices that allow for the calculation of efficiency in terms of effort and cost. The analysis evaluates both direct determination methods (pressure plate, PP, and water activity meter, WAM) and indirect estimation methods (pedotransfer functions, PTFs). The PTFs must be validated for the area of interest before use, but the time and cost associated with this validation are not included in the cost of analysis. Compared to the other methods, the combined use of PP and WAM to determine hydro-physical limits differs significantly in the time and cost required and in the quality of information. For direct methods, increasing sample size significantly reduces cost and time. This paper assesses the effectiveness of combining a general analysis based on efficiency indices with more specific analyses based on the different influencing factors, which were considered separately so as not to mask potential benefits or drawbacks that are not evidenced in the efficiency estimation.

Relevance: 100.00%

Abstract:

BACKGROUND: In May 2010, Switzerland introduced a heterogeneous smoking ban in the hospitality sector. While the law leaves room for exceptions in some cantons, it is comprehensive in others. This longitudinal study uses different measurement methods to examine airborne nicotine levels in hospitality venues and the level of personal exposure of non-smoking hospitality workers before and after implementation of the law. METHODS: Personal exposure to secondhand smoke (SHS) was measured by three different methods: we compared a passive sampler, the MoNIC (Monitor of NICotine) badge, with salivary cotinine and nicotine concentrations as well as questionnaire data. The badges allowed the number of passively smoked cigarettes to be estimated. They were placed at the venues as well as distributed to the participants for personal measurements. To assess personal exposure at work, a time-weighted average of the workplace badge measurements was calculated. RESULTS: Prior to the ban, smoke-exposed hospitality venues yielded a mean badge value of 4.48 (95% CI: 3.7 to 5.25; n = 214) cigarette equivalents/day. At follow-up, measurements in venues that had implemented a smoking ban declined significantly, to an average of 0.31 (0.17 to 0.45; n = 37) (p = 0.001). Personal badge measurements also decreased significantly, from an average of 2.18 (1.31 to 3.05; n = 53) to 0.25 (0.13 to 0.36; n = 41) (p = 0.001). Spearman rank correlations between badge exposure measures and salivary measures were small to moderate (0.3 at maximum). CONCLUSIONS: Nicotine levels decreased significantly in all types of hospitality venues after implementation of the smoking ban. In-depth analyses demonstrated that a time-weighted average of the workplace badge measurements represented typical personal SHS exposure at work more reliably than personal exposure measures such as salivary cotinine and nicotine.
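The time-weighted average used to summarize workplace exposure can be sketched directly; the venue values and hours below are invented for illustration, not taken from the study.

```python
# Time-weighted average (TWA) of badge measurements across the venues a
# worker occupies during a shift; values and hours are illustrative.

def time_weighted_average(measurements):
    """measurements: list of (value, hours) pairs, e.g. cigarette equivalents/day."""
    total_hours = sum(h for _, h in measurements)
    return sum(v * h for v, h in measurements) / total_hours

# A worker spending 6 h in a smoking room (4.5 cig-eq/day) and 2 h in a
# smoke-free office (0.3 cig-eq/day):
shift = [(4.5, 6), (0.3, 2)]
print(round(time_weighted_average(shift), 2))  # 3.45
```

Weighting by hours is what lets a single number stand in for a worker's mixed exposure across differently regulated venues.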