Abstract:
In the field of molecular biology, scientists have for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. Integrative thinking has nonetheless been applied in molecular biology, at a smaller scale, to understand the underlying processes of cellular behaviour for at least half a century. It was not until the genomic revolution at the end of the previous century, however, that model building became necessary to account for the systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our ability to predict cellular behaviour from system dynamics and system structure. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology produces data in volumes whose comprehension cannot even be attempted without computational support. Computational modelling thus bridges modern biology and computer science, providing a number of assets that prove invaluable in the analysis of complex biological systems: a rigorous characterization of the system structure, simulation techniques, perturbation analysis, etc. Computational biomodels have grown considerably in size in recent years, with major contributions made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. The model is subsequently refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative (reaction-network models, rule-based models and Petri net models), as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
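To make the refinement step concrete, here is a minimal sketch (not the thesis's actual framework; species names and rates are hypothetical) of a data refinement on a reaction-network model, in which one species is replaced by several subspecies and every reaction it participates in is duplicated accordingly:

    # Sketch of one refinement step on a reaction-network model: species
    # "A" is split into subspecies "A1" and "A2", and each reaction
    # involving "A" is duplicated for every subspecies.

    def refine(reactions, species, subspecies):
        """Replace `species` by each of `subspecies` in a reaction list.

        A reaction is a tuple (reactants, products, rate). In practice
        the rates of the refined copies must be re-fitted so that the
        refined model reproduces the fit of the original one; the
        uniform split below is only a placeholder.
        """
        refined = []
        for reactants, products, rate in reactions:
            if species not in reactants and species not in products:
                refined.append((reactants, products, rate))
                continue
            for sub in subspecies:
                sub_r = [sub if s == species else s for s in reactants]
                sub_p = [sub if s == species else s for s in products]
                refined.append((sub_r, sub_p, rate / len(subspecies)))
        return refined

    model = [(["A", "B"], ["C"], 0.5), (["C"], ["A"], 0.1)]
    print(refine(model, "A", ["A1", "A2"]))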
Abstract:
The application of nanotechnology to medicine can provide important benefits, especially in oncology, a fact that has resulted in the emergence of a new field called nano-oncology. Nanoparticles can be engineered to incorporate a wide variety of chemotherapeutic or diagnostic agents. A nanocapsule is a vesicular system that exhibits a typical core-shell structure, in which active molecules are confined to a reservoir or cavity surrounded by a polymer membrane or coating. Delivery systems based on nanocapsules are usually transported to a targeted tumor site and then release their contents upon a change in environmental conditions. An effective delivery of the therapeutic agent to the tumor site and to the infiltrating tumor cells is difficult to achieve in many cancer treatments. Therefore, new devices are being developed to facilitate intratumoral distribution, to protect the active agent from premature degradation and to allow its sustained and controlled release. This review focuses on recent studies on the use of nanocapsules for cancer therapy and diagnosis.
Abstract:
This study aimed to analyze the agreement between measurements of unloaded oxygen uptake and peak oxygen uptake predicted by the equations proposed by Wasserman and real measurements obtained directly with the ergospirometry system. We performed an incremental cardiopulmonary exercise test (CPET) in two groups of sedentary male subjects: an apparently healthy group (HG, n=12) and a group with stable coronary artery disease (CG, n=16). The mean age was 47±4 years in the HG and 57±8 years in the CG. Both groups performed CPET on a cycle ergometer with a ramp-type protocol at an intensity calculated according to the Wasserman equation. In the HG, there was no significant difference between the values predicted by the formula and the real measurements obtained in CPET in the unloaded condition. However, at peak effort, a significant difference was observed between V̇O2peak(predicted) and V̇O2peak(real) (nonparametric Wilcoxon test). In the CG, there was a significant difference of 116.26 mL/min between the values predicted by the formula and the real values obtained in the unloaded condition. A significant difference was also found at peak effort, where V̇O2peak(real) was 40% lower than V̇O2peak(predicted) (nonparametric Wilcoxon test). There was no agreement between the real and predicted measurements as analyzed by Lin's coefficient or the Bland and Altman model. The Wasserman formula does not appear to be appropriate for predicting the functional capacity of these volunteers. Therefore, this formula cannot precisely predict the increase in power in incremental CPET on a cycle ergometer.
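For reference, the commonly cited Wasserman prediction equations for a cycle-ergometer ramp protocol in sedentary men can be sketched as follows (an illustration only, not code from the study; the coefficients should be verified against the original reference before any clinical use):

    # Wasserman-style predictions for a cycle-ergometer ramp protocol.
    # Coefficients are the commonly cited ones for sedentary men.

    def vo2_unloaded(weight_kg):
        """Predicted oxygen uptake (mL/min) during unloaded pedalling."""
        return 150.0 + 6.0 * weight_kg

    def vo2_peak(weight_kg, age_yr):
        """Predicted peak oxygen uptake (mL/min), sedentary men."""
        return weight_kg * (50.72 - 0.372 * age_yr)

    def ramp_rate(weight_kg, age_yr):
        """Work-rate increment (W/min) aiming at a ~10-minute ramp."""
        return (vo2_peak(weight_kg, age_yr) - vo2_unloaded(weight_kg)) / 100.0

    # Example: a 47-year-old, 80 kg sedentary man
    print(round(ramp_rate(80.0, 47.0), 1))  # about 20 W/min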
Abstract:
Fluid handling systems such as pump and fan systems have been found to have significant potential for energy efficiency improvements. To deliver this energy saving potential, easily implementable methods are needed to monitor the system output, because information is needed both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
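As a rough illustration of the general idea behind such sensorless estimation (a generic QP-curve sketch, not the specific method proposed in the paper; the datasheet values are hypothetical), the drive's own speed and shaft-power estimates can be combined with the pump's characteristic curve and the affinity laws:

    # Generic QP-curve estimation: the drive provides rotational speed n
    # and shaft power P; the power-vs-flow curve at nominal speed n0
    # comes from the pump datasheet. Affinity laws: Q ~ n, P ~ n^3.
    import numpy as np

    n0 = 1450.0                                   # nominal speed (rpm)
    q0 = np.array([0.0, 10.0, 20.0, 30.0, 40.0])  # flow (m^3/h)
    p0 = np.array([1.2, 1.8, 2.5, 3.3, 4.2])      # shaft power (kW)

    def estimate_flow(n, p_shaft):
        """Estimate flow rate from measured speed and shaft power."""
        p_at_n0 = p_shaft * (n0 / n) ** 3     # scale point to nominal speed
        q_at_n0 = np.interp(p_at_n0, p0, q0)  # invert the monotonic QP curve
        return q_at_n0 * (n / n0)             # scale flow back to actual speed

    print(estimate_flow(n=1200.0, p_shaft=1.45))  # ~17 m^3/h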
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, leaving companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
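A minimal sketch of the core idea of test generation from behavioral models (a toy state machine, not the UML tool-chain developed in the thesis): derive one abstract test case per transition so that all transitions are covered:

    # All-transitions test generation from a small state-machine model.
    from collections import deque

    # Hypothetical model of a media player: (state, event) -> next state
    transitions = {
        ("stopped", "play"): "playing",
        ("playing", "pause"): "paused",
        ("paused", "play"): "playing",
        ("playing", "stop"): "stopped",
        ("paused", "stop"): "stopped",
    }

    def path_to(state, start="stopped"):
        """Shortest event sequence reaching `state` from the initial state."""
        queue, seen = deque([(start, [])]), {start}
        while queue:
            s, events = queue.popleft()
            if s == state:
                return events
            for (src, ev), dst in transitions.items():
                if src == s and dst not in seen:
                    seen.add(dst)
                    queue.append((dst, events + [ev]))
        raise ValueError(f"state {state!r} is unreachable")

    # One abstract test per transition: reach the source state, fire the
    # event, and check that the system ends up in the expected state.
    for (src, ev), dst in transitions.items():
        print(path_to(src) + [ev], "->", dst)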
Abstract:
DNA extraction is a critical step in the analysis of Genetically Modified Organisms based on real-time PCR. In this study, the CTAB and DNeasy methods provided good quality and quantity of DNA from the texturized soy protein, infant formula, and soy milk samples. Concerning the Certified Reference Material consisting of 5% Roundup Ready® soybean, neither method yielded DNA of good quality. However, the dilution test applied to the CTAB extracts showed no interference from inhibitory substances. The PCR efficiencies of lectin target amplification were not statistically different, and the coefficients of correlation (R²) demonstrated a high degree of correlation between the copy numbers and the threshold cycle (Ct) values. ANOVA showed suitable adjustment of the regression and the absence of significant linear deviations. The efficiencies of p35S amplification were not statistically different, and all R² values using DNeasy extracts were above 0.98, with no significant linear deviations. Two out of three R² values using CTAB extracts were lower than 0.98, corresponding to a lower degree of correlation, and the lack-of-fit test showed a significant linear deviation in one run. The comparative analysis of the Ct values for the p35S and lectin targets demonstrated no statistically significant differences between the analytical curves of each target.
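For context, amplification efficiency and R² in such assays are conventionally derived from the standard curve of Ct against the log10 of the copy number; a minimal sketch of that calculation with made-up dilution data (not the study's own analysis):

    # Conventional qPCR standard-curve analysis: fit Ct vs log10(copies);
    # efficiency E = 10^(-1/slope) - 1; R^2 measures linearity.
    import numpy as np

    copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])   # 10-fold dilutions
    ct = np.array([18.1, 21.5, 24.9, 28.3, 31.8])  # hypothetical Ct values

    x = np.log10(copies)
    slope, intercept = np.polyfit(x, ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    r2 = np.corrcoef(x, ct)[0, 1] ** 2

    print(f"slope={slope:.2f}, E={efficiency:.1%}, R^2={r2:.4f}")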
Abstract:
A test that is simple, accurate, inexpensive, gives rapid results, and is sensitive enough to detect low levels of microorganisms would be the most suitable for food industry routine laboratories, or even for public health laboratories. A ready-to-use, commercially available alternative is the Petrifilm™ EB method. The aim of this study was to evaluate whether there is a statistically significant difference between the conventional method based on Violet Red Bile Glucose Agar and the alternative 3M™ Petrifilm™ EB method for the enumeration of Enterobacteriaceae in poultry carcasses. This study also assessed whether the alternative method was able to produce results directly proportional to the concentration of the target (approximately 270 colony-forming units (CFU)·mL⁻¹). A total of 120 poultry carcass samples showed a significant difference (p < 0.05) between the populations obtained by the two methods, and the conventional method showed low proportionality between the dilutions. On the other hand, the Petrifilm™ EB quantification system showed the capacity to produce results proportional to the concentration of the analyte in samples in the range from 1 to 256 CFU·mL⁻¹.
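The proportionality assessment mentioned above can be illustrated with a simple check on a dilution series (a generic sketch with hypothetical counts, not the study's statistical protocol): in a two-fold series, log2 of the counts should fall on a line of slope close to -1:

    # Linearity check for a plate-count method over a 2-fold dilution
    # series: counts should halve at each step, so the slope of
    # log2(count) vs dilution step should be close to -1.
    import numpy as np

    counts = np.array([256, 131, 66, 31, 16, 8])  # hypothetical CFU/mL
    steps = np.arange(len(counts))

    slope, _ = np.polyfit(steps, np.log2(counts), 1)
    print(f"slope = {slope:.2f} (ideal: -1.00)")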
Abstract:
Sweeteners based on stevia extract contain a series of diterpene glycosides derived from steviol, most notably rebaudioside-A. There are not yet any tabletop sweeteners on the market formulated purely with rebaudioside-A, so its use in foods depends on the development of new products followed by physicochemical and sensory evaluations. This work presents the formulation of a diet strawberry jam colored with cranberry juice and sweetened with rebaudioside-A purified from stevia plants of the lineage UEM-320, developed at the Centro de Estudos de Produtos Naturais da Universidade Estadual de Maringá. The physicochemical, microbiological and sensory characteristics of the product were evaluated in comparison with a control sweetened with an equal amount of sucralose. The results showed that the physicochemical characteristics of the sample and the control were not significantly different, and the supplementation with cranberry juice increased both the color and the total phenolic content of both samples. The sensory acceptability indicated a significant preference for the formulation sweetened with 100% rebaudioside-A, but only for the attributes flavor and purchase intent. We concluded that rebaudioside-A has a better sensory performance than sucralose, even though the latter is 1.33-fold sweeter than rebaudioside-A.
Abstract:
Purple sweet potato (PSP) can provide products with attractive color, besides nutritional benefits, in food processing. Therefore, the composition and color stability of an aqueous anthocyanin-based PSP extract were investigated in order to promote its wider use in the food industry. PSP anthocyanins were extracted with water, and nine individual anthocyanins (48.72 μg mL⁻¹ in total, a yield of 24.36 mg/100 g fresh PSP) were found by HPLC analysis. The PSP extract also contained 17.11 mg mL⁻¹ of protein, 0.44 mg mL⁻¹ of dietary fiber, 2.82 mg mL⁻¹ of reducing sugars, 4.02 μg mL⁻¹ of Se, 54.21 μg mL⁻¹ of Ca and 60.83 μg mL⁻¹ of Mg. Changes in the color and stability of the PSP extract, as affected by pH, heat, light and the extraction process, were further evaluated. Results indicated that PSP anthocyanins had good stability at pH 2.0-6.0, and the color of the PSP extract remained stable during 30 days of storage at 20 °C in the dark. Both UV and fluorescent light exposure weakened the color stability of the PSP extract, UV having the more drastic effect. A steaming pretreatment of fresh PSP is beneficial to the color stability.
Abstract:
Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., that each point represent height above the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests where the canopy is normally very thick (partially closed), leading to a situation in which only a limited number of laser pulses reach the ground. Therefore, robust algorithms for extracting accurate DTMs in low-ground-point-density situations are needed in order to realize the full potential of airborne laser scanner data for forestry. The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and a low number of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest in this case is whether or not local forest communities can achieve the level of technical proficiency required for accurate forest monitoring. The algorithms for extracting DTMs from LiDAR point clouds presented in this thesis address the challenges of extracting DTMs in low-ground-point situations and in complex terrain, while the algorithm for CBH estimation addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by, for example, variations in tree species and the season of data acquisition. These algorithms are adaptive with respect to point cloud characteristics and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. A comparison with existing DTM extraction algorithms showed that the DTM extraction algorithms proposed in this thesis performed better with respect to the accuracy of estimating tree heights from airborne laser scanner data. On the other hand, the proposed DTM extraction algorithms, being mostly based on trend surface interpolation, cannot retain small terrain features (e.g., bumps, small hills and depressions). Therefore, the DTMs generated by these algorithms are only suitable for forestry applications where the primary objective is to estimate tree heights from normalized airborne laser scanner data. The algorithm for estimating CBH proposed in this thesis is based on the idea of a moving voxel, in which gaps (openings in the canopy) that act as fuel breaks are located and their heights estimated. Test results showed a slight improvement in CBH estimation accuracy over existing CBH estimation methods based on height percentiles in the airborne laser scanner data. More importantly, being based on the idea of a moving voxel, this algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity. This information can be used to create vertical fuel continuity maps, which can provide more realistic information on the risk of crown fires than CBH alone.
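As background on the normalization step itself (a generic sketch, not one of the thesis's algorithms; the grid and points are made up), each return's height above ground is obtained by subtracting the DTM elevation interpolated at its horizontal position:

    # Point-cloud normalization: height above ground = point elevation
    # minus the DTM elevation bilinearly interpolated at (x, y).
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    xs = np.arange(0, 100.0)   # easting grid (1 m resolution)
    ys = np.arange(0, 100.0)   # northing grid
    dtm = np.random.default_rng(0).uniform(120, 125, (len(ys), len(xs)))

    ground = RegularGridInterpolator((ys, xs), dtm, method="linear")

    # Hypothetical returns: columns x, y, z (elevation in metres)
    points = np.array([[10.3, 20.7, 143.2],
                       [55.1, 60.9, 128.4]])
    heights = points[:, 2] - ground(points[:, [1, 0]])
    print(heights)   # canopy heights above ground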
Abstract:
The occurrence of green seeds in soybean [Glycine max (L.) Merrill] is a problem closely related to unfavorable climatic conditions, mainly drought, occurring during the final stages of seed maturation. This problem causes serious losses in soybean seed quality in Brazil. In these seeds, chlorophyll is not properly degraded during maturation, drastically reducing seed quality. Using the chlorophyll fluorescence technique, it is possible to remove green seeds from the seed lot, improving seed quality in several species in which the occurrence of green seeds is also a problem. The objective of this research was to study the use of the chlorophyll fluorescence technique in sorting green seeds from soybean seed samples and its effects on quality. Five seed samples of soybean, cultivar TMG 113 RR, with 0%, 5%, 10%, 15%, and 20% green seeds were used in this study. Seeds from each sample were sorted into two fractions based on their chlorophyll fluorescence signals and then compared to the control (non-sorted seeds). The sorting process showed great differences between the low and high chlorophyll fluorescence fractions. It was concluded that green soybean seeds present high chlorophyll fluorescence, that this characteristic affects seed quality, and that it is possible to improve soybean seed quality by removing green seeds using the chlorophyll fluorescence sorting technique.
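The sorting itself amounts to thresholding on the measured fluorescence signal; a trivial sketch of that decision rule (the threshold and signal values are hypothetical, not the study's equipment settings):

    # Fluorescence-based seed sorting: seeds whose chlorophyll
    # fluorescence exceeds a threshold go to the reject fraction.
    THRESHOLD = 0.35  # hypothetical normalized fluorescence level

    def sort_seeds(signals):
        """Split seed indices into (accepted, rejected) lists."""
        accepted, rejected = [], []
        for i, s in enumerate(signals):
            (rejected if s > THRESHOLD else accepted).append(i)
        return accepted, rejected

    kept, removed = sort_seeds([0.10, 0.42, 0.08, 0.57, 0.21])
    print(len(kept), "seeds kept,", len(removed), "removed")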
Abstract:
Increased awareness and evolving consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants. Such compounds include, for example, residues of antibiotics used in animals, or mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin and a widely recognized problem for crops and animal feeds globally; and 3) skatole, or 3-methylindole, one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against low-molecular-weight food contaminants, and the subsequent development of immunoassays for the detection of the respective compounds.
Abstract:
Individuals with disabilities are increasingly accessing post-secondary education opportunities to further develop their educational and career goals. This study examines the current facilitative practices of Canadian university activity-based physical education degree programs regarding the participation of individuals with disabilities. A critical orientation and a descriptive/interpretative approach allow insight into the unique stories and experiences of physical education practitioners and special needs professionals as they attempt to provide equitable educational experiences within a least restrictive environment. Leading practitioners are used to triangulate and strengthen the validity of the data while providing direction and advocacy for the future development and inclusion of individuals with disabilities. The study concludes with seven recommendations, each providing university activity-based physical education degree programs with viable opportunities for helping create equitable opportunities for individuals with disabilities.
Abstract:
This research investigated the impact of stress management and relaxation techniques on psoriasis. It had a dual purpose: to see whether stress management and relaxation techniques, as an adjunct to traditional medical treatment, would improve the skin condition of psoriasis patients, and to provide psoriasis patients with a sense of control over their illness by educating them about the connection between mind and body through learning stress management and relaxation techniques. The former purpose was addressed quantitatively, while the latter was addressed qualitatively. Using an experimental design, the quantitative study tested the efficacy of stress management and relaxation techniques on 38 dermatological patients from St. John's, Newfoundland. The study, which lasted ten weeks, suggested a weak relationship between psoriasis and stress; this relationship was not statistically significant. The qualitative data were gathered through unstructured interviews and evaluated using descriptive/interpretative analysis. Patients in the experimental group believed in the mind-body connection as it related to their illness and stress. The findings also showed that the patients believed the stress reduction and relaxation techniques improved their quality of life, their level of psoriasis, and their ability to live with the condition. Given the contradictory nature of the findings, further research is needed. It is posited that replication of this study would be vastly improved by increasing the sample size, to increase the possibility of significant findings. As well, increasing the length of the experiment would control for the possibility of a lag effect. Finally, the study looked at linear relationships between stress and psoriasis; further study should ascertain whether the relationship might be nonlinear.
Abstract:
The last several decades have been marked by tremendous changes in education: technological, pedagogical, administrative, and social. These changes have led to considerable increases in the budgets devoted to professional development for teachers, with the express purpose of helping them accommodate their practices to the new realities of their classrooms. However, research has suggested that, in spite of the emphasis placed on encouraging sustained change in teaching practices, little has been accomplished. This begs the question of what ought to be done not only to reverse this outcome, but to contribute to transformational change. The literature suggests some possibilities, including: a) considering teachers as learners and applying what is known about cognition and learning; b) modifying the location and nature of professional development so that it is authentic, based in the classroom, and focused on tasks meaningful to the teacher; c) attending to the infrastructure underlying professional development; and d) ensuring opportunities for reflective practice. This dissertation looks at the impact of each of these variables through an analysis of the learning journeys of a group of teachers engaged in a program called GrassRoots in one mid-sized school board in Ontario. Action research was conducted by the researcher in his role as a consultant facilitating teacher professional growth around the use of Web sites as culminating performance tasks by students. The research focused on the pedagogical approach to the learning of the teachers involved and the infrastructure underlying their learning. Using grounded theory, a model for professional development was developed that can be used in the future to inform practices and, hopefully, lead to sustained transformational school change.