932 results for Modified Direct Analysis Method
Abstract:
BACKGROUND: A phase I dose-escalation trial of transarterial chemoembolisation (TACE) with idarubicin-loaded beads was performed in cirrhotic patients with hepatocellular carcinoma (HCC). AIM: To estimate the maximum-tolerated dose (MTD) and to assess safety, efficacy, pharmacokinetics and quality of life. METHODS: Patients received a single TACE session with injection of 2 mL drug-eluting beads (DEBs; DC Bead 300-500 μm) loaded with idarubicin. The idarubicin dose was escalated according to a modified continuous reassessment method. MTD was defined as the dose level closest to that causing dose-limiting toxicity (DLT) in 20% of patients. RESULTS: Twenty-one patients were enrolled, including nine patients at 5 mg, six patients at 10 mg, and six patients at 15 mg. One patient at each dose level experienced DLT (acute myocardial infarction, hyperbilirubinaemia and elevated aspartate aminotransferase (AST) at 5-, 10- and 15-mg, respectively). The calculated MTD of idarubicin was 10 mg. The most frequent grade ≥3 adverse events were pain, elevated AST, elevated γ-glutamyltranspeptidase and thrombocytopenia. At 2 months, the objective response rate was 52% (complete response, 28%, and partial response, 24%) by modified Response Evaluation Criteria in Solid Tumours. The median time to progression was 12.1 months (95% CI 7.4 months - not reached); the median overall survival was 24.5 months (95% CI 14.7 months - not reached). Pharmacokinetic analysis demonstrated the ability of DEBs to release idarubicin slowly. CONCLUSIONS: Using drug-eluting beads, the maximum-tolerated dose of idarubicin was 10 mg per TACE session. Encouraging responses and median time to progression were observed. Further clinical investigations are warranted (NCT01040559).
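The dose-escalation logic above follows a modified continual reassessment method (CRM). As a rough illustration of how a one-parameter CRM recommends the level whose estimated DLT probability sits closest to the 20% target, here is a minimal sketch in Python; the skeleton prior, the normal prior on the model parameter, and the grid posterior are illustrative assumptions, not the trial's actual design:

```python
import math

def crm_recommend(skeleton, doses_given, dlt, target=0.20):
    """One-parameter power-model CRM (illustrative sketch).

    skeleton    : prior guesses of the DLT probability at each dose level
    doses_given : 0-based dose-level index for each treated patient
    dlt         : 1 if that patient had a dose-limiting toxicity, else 0
    target      : acceptable DLT rate (20% in the trial above)

    Returns (recommended level, estimated DLT probabilities).
    """
    # Grid approximation of the posterior for the model p_i = s_i ** exp(a)
    # under a N(0, 1.34) prior on a (a common CRM default choice).
    grid = [-3.0 + 6.0 * k / 400 for k in range(401)]
    post = []
    for a in grid:
        loglik = 0.0
        for lvl, y in zip(doses_given, dlt):
            p = skeleton[lvl] ** math.exp(a)
            loglik += math.log(p) if y else math.log(1.0 - p)
        post.append(math.exp(loglik - a * a / (2 * 1.34)))
    a_hat = sum(a * w for a, w in zip(grid, post)) / sum(post)  # posterior mean
    est = [s ** math.exp(a_hat) for s in skeleton]
    # Recommend the level whose estimated DLT rate is closest to target.
    return min(range(len(skeleton)), key=lambda i: abs(est[i] - target)), est
```

With a hypothetical skeleton of (0.05, 0.15, 0.30) for the 5/10/15 mg levels and the DLT counts reported above (1/9, 1/6, 1/6), this sketch lands on the middle level, consistent with the reported MTD of 10 mg.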
Abstract:
The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by a direct counting method in serial sections. We reconstructed one-fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated and counted by the physical disector and direct counting methods, respectively. In addition, using the coordinates of nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions the error between the physical disector and direct counting results does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
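The spacing effect the authors simulated in Matlab can be sketched in a few lines of Python. This is a toy one-dimensional ganglion model with uniformly placed nuclei, not the authors' reconstruction: systematic disector slabs are sampled at a given separation, and the raw count is scaled by the inverse sampling fraction.

```python
import random

def disector_estimate(z_coords, length, thickness, spacing, offset=0.0):
    """Estimate total cell number from systematic disector slabs.

    Nuclei whose z-coordinate falls inside any slab
    [k*spacing + offset, k*spacing + offset + thickness) are counted,
    and the count is scaled by the inverse sampling fraction
    spacing/thickness.
    """
    counted = sum(1 for z in z_coords if (z - offset) % spacing < thickness)
    return counted * spacing / thickness

# Toy model: 5000 nuclei placed uniformly through a 1000-um-deep ganglion.
random.seed(1)
length = 1000.0
cells = [random.uniform(0, length) for _ in range(5000)]

for spacing in (60.0, 200.0):
    est = disector_estimate(cells, length, thickness=4.0, spacing=spacing)
    err = abs(est - len(cells)) / len(cells) * 100
    print(f"spacing {spacing:.0f} um: estimate {est:.0f}, error {err:.1f}%")
```

Wider spacing samples fewer slabs, so the estimate's sampling variance, and hence the typical error against the true count, grows with the separation, which is the qualitative trend the abstract reports.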
Abstract:
The feasibility of substituting fibercomposite (FC) (thermoset) pavement dowels for steel pavement dowels was investigated in this research project. Load transfer capacity, flexural capacity, and material properties were examined. Part 1 of this final report covers the shear behavior and strength-deformation characteristics of unaged FC dowel bars; Part 2 will cover the aging effects. The theoretical model included the effects of the modulus of elasticity of the pavement dowel and concrete, dowel diameter, subgrade stiffness, and concrete compressive strength. An experimental investigation was carried out to establish the modulus of dowel support, which is an important parameter for the analysis of dowels. The experimental investigation included measured deflections, observed behavioral characteristics, and failure mode observations. An extensive study was performed on various shear testing procedures. A modified Iosipescu shear method was selected for the test procedure, and a special test frame was designed and fabricated for it. The experimental values of the modulus of support for shear and FC dowels were used to arrive at the critical stresses and deflections for the theoretical model developed. Different theoretical methods based on analyses suggested by Timoshenko, Friberg, Bradbury, and Westergaard were studied, and a comprehensive theoretical model was developed. The fibercomposite dowels were found to provide strengths and behavioral characteristics that appear promising as a potential substitute for steel dowels.
Abstract:
BACKGROUND: Studies on hexaminolevulinate (HAL) cystoscopy report improved detection of bladder tumours. However, recent meta-analyses report conflicting effects on recurrence. OBJECTIVE: To assess available clinical data for blue light (BL) HAL cystoscopy on the detection of Ta/T1 and carcinoma in situ (CIS) tumours, and on tumour recurrence. DESIGN, SETTING, AND PARTICIPANTS: This meta-analysis reviewed raw data from prospective studies on 1345 patients with known or suspected non-muscle-invasive bladder cancer (NMIBC). INTERVENTION: A single application of HAL cystoscopy was used as an adjunct to white light (WL) cystoscopy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: We studied the detection of NMIBC (intention to treat [ITT]: n=831; six studies) and recurrence (per protocol: n=634; three studies) up to 1 yr. DerSimonian and Laird's random-effects model was used to obtain pooled relative risks (RRs) and associated 95% confidence intervals (CIs) for detection outcomes. RESULTS AND LIMITATIONS: BL cystoscopy detected significantly more Ta tumours (14.7%; p<0.001; odds ratio [OR]: 4.898; 95% CI, 1.937-12.390) and CIS lesions (40.8%; p<0.001; OR: 12.372; 95% CI, 6.343-24.133) than WL. At least one additional Ta/T1 tumour was seen with BL in 24.9% of patients (p<0.001), a finding also significant in patients with primary (20.7%; p<0.001) and recurrent cancer (27.7%; p<0.001), and in patients at high risk (27.0%; p<0.001) and intermediate risk (35.7%; p=0.004). In 26.7% of patients, CIS was detected only by BL (p<0.001), a finding also significant in patients with primary (28.0%; p<0.001) and recurrent cancer (25.0%; p<0.001). Recurrence rates up to 12 mo were significantly lower overall with BL, 34.5% versus 45.4% (p=0.006; RR: 0.761 [0.627-0.924]), and lower in patients with T1 or CIS (p=0.052; RR: 0.696 [0.482-1.003]), Ta (p=0.040; RR: 0.804 [0.653-0.991]), and in high-risk (p=0.050) and low-risk (p=0.029) subgroups.
Some subgroups had too few patients to allow statistically meaningful analysis. Heterogeneity was minimised by the statistical analysis method used. CONCLUSIONS: This meta-analysis confirms that HAL BL cystoscopy significantly improves the detection of bladder tumours leading to a reduction of recurrence at 9-12 mo. The benefit is independent of the level of risk and is evident in patients with Ta, T1, CIS, primary, and recurrent cancer.
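The pooling step named above, DerSimonian and Laird's random-effects model, can be sketched in a few lines: compute each study's log relative risk and its variance, estimate the between-study variance from Cochran's Q, then take an inverse-variance weighted mean. The event counts in the demonstration below are hypothetical, not the trial data.

```python
import math

def dersimonian_laird(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of relative risks.

    Each argument is a per-study list: events and sample size in the
    treated arm and in the control arm.  Returns the pooled RR, its
    95% CI, and the between-study variance tau^2.
    """
    log_rr, var = [], []
    for a, na, c, nc in zip(events_t, n_t, events_c, n_c):
        log_rr.append(math.log((a / na) / (c / nc)))
        var.append(1/a - 1/na + 1/c - 1/nc)      # variance of log RR
    w = [1 / v for v in var]                     # fixed-effect weights
    s1 = sum(w)
    mu_fe = sum(wi * y for wi, y in zip(w, log_rr)) / s1
    q = sum(wi * (y - mu_fe) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    c_dl = s1 - sum(wi * wi for wi in w) / s1
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c_dl)  # between-study variance
    w_re = [1 / (v + tau2) for v in var]         # random-effects weights
    mu = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(mu),
            (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se)),
            tau2)
```

With two identical hypothetical studies the heterogeneity estimate collapses to zero and the pooled RR equals the common per-study RR, which is a quick sanity check on the implementation.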
Abstract:
The aim of the thesis is to identify the typical groups of software companies operating in the Southeast Finland region and to describe their activities. By identifying the characteristic features of the region's software companies, the work also provides a basis for finding future development targets and for directing development measures at several companies of the same type. As background, the software industry and software business models are presented, and on this basis a framework is formed for the empirical examination of the region's software companies. The empirical part examines how the business models presented in the theoretical part are realized in Southeast Finland and groups the region's software companies by the most discriminating factors. The study is a quantitative total survey of the software companies of Southeast Finland, and its research approach is descriptive. The research data were based on structured interviews conducted by the research group, in which the responsible persons of a total of 58 software companies were interviewed. Based on the results, four basic types of software business with different modes of operation could be identified in the region: customer-oriented operators (26 establishments), tailors (14 establishments), integrators (10 establishments), and productizers (8 establishments). The results show that the descriptions of traditional software business models, and the generalizations drawn from them, provide a good starting point for examining software companies. However, the perspective offered by traditional software business models is too limited for a deeper examination of the business logic of software companies.
Abstract:
Introduction. Genetic epidemiology focuses on the study of the genetic causes that determine health and disease in populations. To achieve this goal, a common strategy is to explore differences in genetic variability between diseased and non-diseased individuals. Usual markers of genetic variability are single nucleotide polymorphisms (SNPs), which are changes in just one base in the genome. The usual statistical approach in a genetic epidemiology study is a marginal analysis, where each SNP is analyzed separately for association with the phenotype. Motivation. It has been observed that for common diseases the single-SNP analysis is not very powerful for detecting causal genetic variants. In this work, we consider Gene Set Analysis (GSA) as an alternative to standard marginal association approaches. GSA aims to assess the overall association of a set of genetic variants with a phenotype and has the potential to detect subtle effects of variants in a gene or a pathway that might be missed when assessed individually. Objective. We present a new optimized implementation of a pair of gene set analysis methodologies for analyzing the individual evidence of SNPs in biological pathways. We perform a simulation study exploring the power of the proposed methodologies in a set of scenarios with different numbers of causal SNPs under different effect sizes. In addition, we compare the results with the usual single-SNP analysis method. Moreover, we show the advantage of using the proposed gene set approaches in the context of an Alzheimer disease case-control study in which we explore the Reelin signaling pathway.
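As a flavor of what a set-level test buys over single-SNP analysis, consider combining the per-SNP p-values of a pathway. The abstract does not specify its two methodologies, so this sketch uses Fisher's classical p-value combination instead: under the global null, minus twice the sum of log p-values is chi-square distributed with 2k degrees of freedom, and for even degrees of freedom the survival function has a closed form.

```python
import math

def fisher_gene_set_p(p_values):
    """Set-level p-value by Fisher's combination test.

    Under the global null, -2 * sum(log p_i) ~ chi-square with 2k df.
    For even df the survival function has the closed form
    P(X > x) = exp(-x/2) * sum_{i<k} (x/2)**i / i!,
    so no external statistics library is needed.
    """
    k = len(p_values)
    x = -2.0 * sum(math.log(p) for p in p_values)
    term, total = 1.0, 1.0                 # i = 0 term of the series
    for i in range(1, k):
        term *= (x / 2.0) / i              # build (x/2)**i / i! iteratively
        total += term
    return math.exp(-x / 2.0) * total
```

Ten SNPs each with an unremarkable marginal p-value of 0.2 jointly yield a set-level p-value below 0.05, which is exactly the kind of subtle aggregate signal that marginal single-SNP tests miss.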
Abstract:
In order to elucidate the traditional classification of archaeological artefacts, a multielemental analytical method for characterisation of their micro and macro chemical constituents, combined with statistical multivariate analysis for classification, was used. Instrumental thermal neutron activation analysis was applied for elemental chemical determination, together with three statistical methods: discriminant, cluster, and modified cluster analysis. The statistical results obtained for the samples from the Iquiri, Quinari and Xapuri archaeological phases were in good agreement with the conventional archaeological classification. The Iaco and Jacuru archaeological phases were not characterised as homogeneous groups. The Iquiri phase was the most distinct in relation to the other analysed groups. A homogeneous group comprising 54% of the samples collected at the Los Angeles site was also found; this could be characterised as a new archaeological phase.
Abstract:
A rapid, economical, reproducible, and simple direct spectrophotometric method was developed and validated for the assay of nitazoxanide in pharmaceutical formulations. Nitazoxanide concentration was estimated in water at 345 nm and pH 4.5. The method was validated for specificity, linearity, precision, and accuracy. There was no interference from the excipients in the determination of the active pharmaceutical ingredient. The proposed method was successfully applied to the determination of nitazoxanide in coated tablets and in powders for oral suspension, and was compared to a previously developed and validated liquid chromatography method for the same drug. There was no significant difference between the two methods for nitazoxanide quantitation.
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
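The quality parameters computed in the first exercise can be reproduced with ordinary least squares. The sketch below is plain Python, not the Goodle GMS or Matlab code, and uses the common IUPAC-style conventions (sensitivity = slope; LOD and LOQ from the residual standard deviation) as illustrative assumptions:

```python
import math

def calibration_quality(conc, signal):
    """Least-squares calibration line plus usual quality parameters:
    sensitivity (slope), coefficient of determination R^2, and
    detection/quantitation limits from the residual std deviation."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
    s_res = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual std dev
    syy = sum((y - my) ** 2 for y in signal)
    r2 = 1 - sum(r * r for r in resid) / syy
    lod = 3.3 * s_res / slope    # IUPAC-style detection limit
    loq = 10.0 * s_res / slope   # quantitation limit
    return {"slope": slope, "intercept": intercept, "r2": r2,
            "lod": lod, "loq": loq}
```

On a noise-free standard series the residuals vanish, so R² is 1 and the detection limit collapses to zero, which makes such a series a convenient sanity check for an auto-graded exercise.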
Abstract:
In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for the teaching/learning of the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by students of health sciences who participate in scientific projects in an academic research laboratory. An observational, descriptive, cross-sectional study was conducted using Edgar Morin's complexity as the theoretical reference. In a semi-structured interview, students were asked to solve an abstract logical puzzle, the TanGram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. The solving of the TanGram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected in the students' opinions about the characteristics of a successful researcher. Students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.
Abstract:
DNA extraction is a critical step in Genetically Modified Organisms analysis based on real-time PCR. In this study, the CTAB and DNeasy methods provided good quality and quantity of DNA from the texturized soy protein, infant formula, and soy milk samples. Concerning the Certified Reference Material consisting of 5% Roundup Ready® soybean, neither method yielded DNA of good quality. However, the dilution test applied to the CTAB extracts showed no interference from inhibitory substances. The PCR efficiencies of lectin target amplification were not statistically different, and the coefficients of correlation (R²) demonstrated a high degree of correlation between the copy numbers and the threshold cycle (Ct) values. ANOVA showed suitable adjustment of the regression and absence of significant linear deviations. The efficiencies of the p35S amplification were not statistically different, and all R² values using DNeasy extracts were above 0.98 with no significant linear deviations. Two out of three R² values using CTAB extracts were lower than 0.98, corresponding to a lower degree of correlation, and the lack-of-fit test showed significant linear deviation in one run. The comparative analysis of the Ct values for the p35S and lectin targets demonstrated no statistically significant differences between the analytical curves of each target.
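The efficiency and R² figures compared in the abstract come from standard-curve regression: Ct is regressed on the log10 of the initial copy number, and efficiency is derived from the slope. A minimal sketch of that generic formula (not the study's software) looks like this:

```python
import math

def pcr_efficiency(copies, ct):
    """Amplification efficiency from a qPCR standard curve.

    Regresses Ct on log10(initial copy number).  A slope of about
    -3.32 corresponds to perfect doubling each cycle, since
    E = 10**(-1/slope) - 1 and E = 1.0 means 100% efficiency.
    Returns (efficiency, slope, R^2 of the curve).
    """
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(ct) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct))
    syy = sum((yi - my) ** 2 for yi in ct)
    slope = sxy / sxx
    r2 = sxy * sxy / (sxx * syy)   # squared correlation of the standard curve
    return 10 ** (-1.0 / slope) - 1, slope, r2
```

Feeding it an ideal ten-fold dilution series with a slope of -3.32 recovers an efficiency of essentially 100% and an R² of 1, which is the benchmark against which the reported values above 0.98 are judged.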
Abstract:
The aims of this study were to use the isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit to identify beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. These beverages (apple nectars) were produced in the laboratory according to the Brazilian legislation. Adulterated nectars were also produced with an amount of pulp juice below the permitted threshold limit value. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the C3 source percentage. In order to demonstrate the existence of adulteration, the values found were compared to the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits, making it possible to identify the nectars that were in conformity with Brazilian law. The isotopic methodology developed proved efficient to quantify the carbon of C3 origin in commercial apple nectars.
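Quantifying the C3 carbon fraction from δ13C rests on a two-endmember mixing model: apple pulp (C3 photosynthesis) and cane sugar (C4) have distinct isotopic signatures, so a sample's δ13C locates it between them. The endmember values in this sketch (about -27‰ for C3 and -12‰ for C4) are commonly cited approximations, not the limits the study itself calibrates:

```python
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Fraction of carbon from C3 sources by a two-endmember delta13C
    mixing model.  The default endmember values are commonly cited
    approximations (C3 plant carbon near -27 permil, C4 cane sugar
    near -12 permil), not the legal limits of the Brazilian standard.
    """
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# Under these assumed endmembers, a nectar measuring -19.5 permil
# contains 50% C3 carbon; comparing that percentage against the legal
# limit is what flags an adulterated (sugar-diluted) nectar.
```

The same linear interpolation underlies the study's comparison of measured percentages against the legally established threshold.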
Abstract:
The growing interest in the usage of dietary fiber in food has created the need for precise tools for describing its physical properties. This research examined two dietary fibers, from oats and beets respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The highest water holding capacity (7.00 g water/g solid) was achieved by the smaller-sized oat fiber. Conversely, the water holding capacity was highest (4.20 g water/g solid) in the larger-sized beet fiber. There was evidence of water absorption increasing with a decrease in particle size for the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness, and width, and hydration properties measured conventionally. The regression analysis provided the opportunity to estimate whether the automated static image analysis method could be an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model, which was verified against conventional WHC measurement results.
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering to study and process biological data. The need is also increasing for tools that can be used by the biological researchers themselves who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.
Abstract:
Digitalization and technology megatrends such as Cloud services have provided SMEs with a suitable atmosphere and conditions to internationalize and seek further business growth. There is a limited amount of research on Cloud services from the business perspective, and on the limitations and challenges SMEs encounter when pursuing international business growth. Thus, the main research question of this study was how Cloud services may enable Finnish SMEs to overcome international growth challenges. The research question was further divided into three sub-questions dealing with the features and characteristics of Cloud services, the limitations and challenges Finnish SMEs experience when pursuing international growth of business, and the benefits and advantages of utilizing Cloud services to mitigate and suppress international growth challenges. First, the theoretical framework of this study was constructed based on the existing literature on Cloud services, SMEs, and international growth challenges. After this, a qualitative research approach and methodology were applied. The data were collected through six semi-structured expert interviews conducted in person with representatives of IBM, Exidio, Big Data Solutions, and Comptel. After analyzing the collected data by applying the thematic analysis method, the results were compared with the existing theory, and the original framework was modified and complemented accordingly. Resource scarcity, customer base expansion and retention, and lack of courage to try new things and take risks turned out to be the major international growth challenges of Finnish SMEs.
Due to a number of benefits and advantages of utilizing Cloud services, including service automation, consumption-based pricing, absence of capital expenditures (capex) and huge upfront investments, a lightened organization structure, cost savings, speed, accessibility, scalability, agility, geographical expansion potential, global reach and coverage, credibility, partners, enhanced CRM, freedom, and flexibility, it can be concluded that Cloud services can directly and indirectly help Finnish SMEs mitigate and overcome international growth challenges and enable further business growth.