12 results for Synthetic and analytic methods

in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Abstract:

In Czech schools, two methods of teaching reading are used: the analytic-synthetic (conventional) method and the genetic method (created in the 1990s). They differ in their theoretical foundations and in their methodology. The aim of this paper is to describe both theoretical approaches and to present the results of a study that followed the differences in the development of initial reading skills between the two methods. A total of 452 first-grade children (ages 6-8) were assessed with a battery of reading tests at the beginning and at the end of the first grade and at the beginning of the second grade; 350 pupils participated at all three time points. Based on the data analysis, the developmental dynamics of reading skills under both methods and the main differences in several aspects of reading ability (e.g., reading speed, reading technique, error rate) are described. The main focus is on the development of reading comprehension. Results show that pupils instructed with the genetic approach scored significantly better on the reading comprehension tests used, especially in the first grade. Statistically significant differences also occurred between classes independently of the method used; therefore, other factors, such as the teacher's role and class composition, are discussed.

Abstract:

Epidemiological studies have shown an increased prevalence of respiratory symptoms and adverse changes in pulmonary function parameters in poultry workers, corroborating their increased exposure to risk factors such as fungal load and fungal metabolites. This study aimed to determine the occupational exposure threat posed by fungal contamination with toxigenic isolates belonging to the Aspergillus flavus species complex and with isolates from the Aspergillus fumigatus species complex. The study was carried out in seven Portuguese poultry units, using cultural and molecular methodologies. For the conventional/cultural methods, air, surface, and litter samples were collected by the impaction method using the Millipore Air Sampler. For the molecular analysis, air samples were collected by the impinger method using the Coriolis μ air sampler. After DNA extraction, samples were analyzed by real-time PCR using primers and probes specific for toxigenic strains of the Aspergillus flavus complex and for the detection of isolates from the Aspergillus fumigatus complex. Through conventional methods, and within the Aspergillus genus, different prevalences were detected for the Aspergillus flavus and Aspergillus fumigatus species complexes: 74.5 versus 1.0% in the air samples, 24.0 versus 16.0% on surfaces, 0 versus 32.6% in new litter, and 9.9 versus 15.9% in used litter. Through molecular biology, we were able to detect the presence of aflatoxigenic strains in pavilions in which Aspergillus flavus did not grow in culture. Aspergillus fumigatus was found in only one indoor air sample by conventional methods; using molecular methodologies, however, the Aspergillus fumigatus complex was detected in seven indoor samples from three different poultry units. The characterization of fungal contamination by Aspergillus flavus and Aspergillus fumigatus raises concern about occupational threat not only because of the detected fungal load but also because of the toxigenic potential of these species.

Abstract:

The handling of waste and compost that occurs frequently in composting plants (compost turning, shredding, and screening) has been shown to be responsible for the release of dust, airborne microorganisms, and their compounds into the air. Thermophilic fungi, such as A. fumigatus, have been reported, and this kind of contamination in composting facilities has been associated with increased respiratory symptoms among compost workers. This study intended to characterize fungal contamination in a fully indoor composting plant located in Portugal. Besides conventional methods, molecular biology was also applied to overcome their potential limitations.

Abstract:

In hyperspectral imagery, a pixel typically consists of a mixture of the spectral signatures of reference substances, also called endmembers. Linear spectral mixture analysis, or linear unmixing, aims at estimating the number of endmembers, their spectral signatures, and their abundance fractions. This paper proposes a framework for hyperspectral unmixing. A blind method (SISAL) is used to estimate the unknown endmember signatures and their abundance fractions. This method solves a non-convex problem through a sequence of augmented Lagrangian optimizations, in which the positivity constraints, which force the spectral vectors to belong to the convex hull of the endmember signatures, are replaced by soft constraints. The proposed framework simultaneously estimates the number of endmembers present in the hyperspectral image with an algorithm based on the minimum description length (MDL) principle. Experimental results on both synthetic and real hyperspectral data demonstrate the effectiveness of the proposed algorithm.
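
As a rough illustration of the soft-constraint idea described above, the sketch below replaces the hard positivity constraint on the estimated abundances with a hinge penalty inside a simplified SISAL-style objective. This is a minimal sketch, not the authors' implementation; the simplified objective form, the variable names, and the penalty weight `lam` are illustrative assumptions.

```python
import numpy as np

def hinge(x):
    """Soft positivity penalty: zero where x >= 0, linear where x < 0."""
    return np.maximum(-x, 0.0)

def sisal_style_objective(Q, Y, lam):
    """Simplified SISAL-style objective: favor a minimum-volume simplex
    (via -log|det Q|) while penalizing negative abundances Q @ Y with a
    hinge term instead of enforcing Q @ Y >= 0 as a hard constraint."""
    abundances = Q @ Y  # rows: abundance fractions per endmember
    volume_term = -np.log(np.abs(np.linalg.det(Q)))
    return volume_term + lam * hinge(abundances).sum()
```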

Abstract:

Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages to using these nano-scale materials, questions related to their impact on the environment and on human health must be addressed as well, so that potential risks can be limited at early stages of development. At this time, the occupational health risks associated with the manufacture and use of nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, because nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that can even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, assessing human exposure on the basis of the mass concentration of particles, which is widely adopted for particles over 1 μm, may not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. It has therefore been claimed that surface area should be used for nanoparticle exposure and dosing, and assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry, and interact with the body, then there is potential for exposure. It has been shown that surface area plays an important role in the toxicity of nanoparticles and that it is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation, and use of nanomaterials.
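
The scaling behind the surface-area argument is simple sphere geometry: at fixed mass, the total surface area of monodisperse spheres grows as 1/d, since A = 6m/(ρd) for diameter d and density ρ. The sketch below makes this concrete; the particle density used is an arbitrary assumption for illustration.

```python
import numpy as np

def total_surface_area(mass_kg, diameter_m, density_kg_m3=2000.0):
    """Total surface area of a fixed mass of monodisperse spherical particles.
    Since A = 6*m / (rho*d), surface area scales as 1/d at fixed mass."""
    n_particles = mass_kg / (density_kg_m3 * np.pi * diameter_m**3 / 6.0)
    return n_particles * np.pi * diameter_m**2

# Equal mass (1 microgram), two diameters: 100 nm nanoparticles carry 100x the
# surface area of 10 um coarse particles.
for d in (100e-9, 10e-6):
    print(f"d = {d:.0e} m -> area = {total_surface_area(1e-9, d):.3e} m^2")
```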

Abstract:

The impact of mycotoxins on human and animal health is well recognized. Aflatoxin B1 (AFB1) is by far the most prevalent and the most potent natural carcinogen and is usually the major aflatoxin produced by toxigenic fungal strains. Available data point to an increasing frequency of poultry feed contamination by aflatoxins. Since aflatoxin residues may accumulate in body tissues, this represents a high risk to human health. Samples from commercial poultry birds have already presented detectable levels of aflatoxin in the liver. A descriptive study was developed in order to assess fungal contamination by species of the Aspergillus flavus complex in seven Portuguese poultry units. Air fungal contamination was studied by conventional and molecular methods; air, litter, and surface samples were collected. For the molecular methods, air samples of 300 L were collected using the Coriolis μ air sampler (Bertin Technologies) at a 300 L/min airflow rate. For the conventional methodologies, all the collected samples were incubated at 27 °C for five to seven days. Through conventional methods, Aspergillus flavus was the third most frequently found fungal species (7%) in the 27 indoor air samples analysed and the most commonly isolated species (75%) in air samples containing only the Aspergillus genus...

Abstract:

Dissertation submitted to obtain the Master's degree in Electrical Engineering, Energy branch.

Abstract:

Cloud SLAs compensate customers with credits when average availability drops below certain levels. This is too inflexible, because consumers lose non-measurable amounts of performance and are only compensated later, in subsequent charging cycles. We propose to schedule virtual machines (VMs) driven by range-based, non-linear reductions of utility, different for each class of user and across different ranges of resource allocation: partial utility. This customer-defined metric allows providers to transfer resources between VMs in meaningful and economically efficient ways. We define a comprehensive cost model incorporating the partial utility given by clients to a certain level of degradation when VMs are allocated in overcommitted environments (Public, Private, and Community Clouds). CloudSim was extended to support our scheduling model. Several simulation scenarios with synthetic and real workloads are presented, using datacenters of different dimensions in terms of the number of servers and computational capacity. We show that partial utility-driven scheduling allows more VMs to be allocated. It brings benefits to providers, regarding revenue and resource utilization, yielding more revenue per resource allocated and scaling well with the size of the datacenter when compared with a utility-oblivious redistribution of resources. Regarding clients, the execution time of their workloads is also improved by incorporating an SLA-based redistribution of their VMs' computational power.
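
To make the partial-utility idea concrete, here is a minimal hypothetical sketch of a range-based utility table and the revenue a provider would collect under degradation. The class names, breakpoints, and utility values are assumptions for illustration, not the paper's calibrated cost model.

```python
# Each client class maps the fraction of requested capacity actually allocated
# to the fraction of the agreed price the client still pays (partial utility).
PARTIAL_UTILITY = {
    # (allocation threshold, utility retained) -- ordered from best to worst.
    "gold":   [(1.0, 1.0), (0.8, 0.6), (0.5, 0.2)],
    "silver": [(1.0, 1.0), (0.8, 0.8), (0.5, 0.5)],
}

def revenue(client_class: str, allocated_fraction: float, full_price: float) -> float:
    """Revenue under degradation: apply the utility of the first (highest)
    range that the actual allocation still satisfies."""
    for threshold, utility in PARTIAL_UTILITY[client_class]:
        if allocated_fraction >= threshold:
            return utility * full_price
    return 0.0  # below the lowest range: the provider earns nothing

# In an overcommitted datacenter a provider can compare moves: degrading a
# silver VM to 80% costs less revenue than degrading a gold VM to 80%.
print(revenue("gold", 0.8, 100.0), revenue("silver", 0.8, 100.0))  # 60.0 80.0
```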

Abstract:

Liquid crystalline cellulosic-based solutions with distinctive properties are at the origin of different kinds of multifunctional materials with unique characteristics. These solutions can form chiral nematic phases at rest, with tuneable photonic behavior, and exhibit a complex behavior associated with the onset of a network of director-field defects under shear. Techniques such as Nuclear Magnetic Resonance (NMR), rheology coupled with NMR (Rheo-NMR), rheology, optical methods, Magnetic Resonance Imaging (MRI), and Wide Angle X-ray Scattering (WAXS) were extensively used to elucidate the liquid crystalline characteristics of these cellulosic solutions. Cellulosic films produced by shear casting, and fibers produced by electrospinning, from these liquid crystalline solutions have regained wider attention due to the recognition of their innovative properties and their biocompatibility. Electrospun membranes composed of helical and spiral-shaped fibers achieve large surface areas, improving the performance of this kind of system. The moisture response, light-modulated wettability, and the capability of orienting protein and cellulose crystals have opened a wide range of new applications for the shear-cast films. Characterization by NMR, X-rays, tensile tests, AFM, and optical methods allowed detailed characterization of these soft cellulosic materials. In this work, special attention is given to recent developments, including, among others, a moisture-driven cellulosic motor and electro-optical devices.

Abstract:

Clustering ensemble methods produce a consensus partition of a set of data points by combining the results of a collection of base clustering algorithms. In the evidence accumulation clustering (EAC) paradigm, the clustering ensemble is transformed into a pairwise co-association matrix, thus avoiding the label correspondence problem, which is intrinsic to other clustering ensemble schemes. In this paper, we propose a consensus clustering approach based on the EAC paradigm which is not limited to crisp partitions and which fully exploits the nature of the co-association matrix. Our solution determines probabilistic assignments of data points to clusters by minimizing a Bregman divergence between the observed co-association frequencies and the corresponding co-occurrence probabilities, expressed as functions of the unknown assignments. We additionally propose an optimization algorithm to find a solution under any double-convex Bregman divergence. Experiments on both synthetic and real benchmark data show the effectiveness of the proposed approach.
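
As a toy illustration of the approach (not the paper's algorithm), the sketch below fits probabilistic assignments P to a co-association matrix C under the squared Euclidean divergence, one member of the Bregman family, using projected gradient steps. The learning rate and the clip-and-renormalize step (a crude stand-in for an exact simplex projection) are simplifying assumptions.

```python
import numpy as np

def consensus_from_coassoc(C, k, iters=500, lr=0.05, seed=0):
    """EAC-style consensus sketch: fit soft assignments P (n x k) so that the
    co-occurrence probabilities P @ P.T match the observed co-association
    frequencies C under the squared Euclidean (a Bregman) divergence."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    P = rng.dirichlet(np.ones(k), size=n)  # rows on the probability simplex
    for _ in range(iters):
        R = P @ P.T - C                    # residual of the co-occurrence fit
        grad = 4.0 * R @ P                 # gradient of ||P P^T - C||_F^2 (C symmetric)
        P = np.clip(P - lr * grad, 1e-12, None)
        P /= P.sum(axis=1, keepdims=True)  # renormalize rows to the simplex
    return P
```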

Abstract:

The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method that derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster and uses these co-occurrence statistics to derive a similarity matrix, referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach to extracting a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model of the co-association matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving a consensus solution according to a MAP approach, with Dirichlet priors defined over the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution with both synthetic and real benchmark data.
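
The effect of the Dirichlet prior on the MAP objective can be sketched in one line: since a symmetric Dirichlet density is proportional to the product of assignment probabilities raised to (alpha - 1), the negative log-prior adds a -(alpha - 1) * sum(log P) term to the negative log-likelihood. The snippet below is illustrative only; the actual PEACE likelihood comes from the paper's probabilistic model, and `neg_log_lik` here is a placeholder.

```python
import numpy as np

def neg_log_posterior(P, C, alpha, neg_log_lik):
    """MAP sketch: negative log-posterior = negative log-likelihood plus the
    negative log of a symmetric Dirichlet prior over each assignment row,
    -(alpha - 1) * sum(log P), up to an additive constant. alpha > 1 pulls
    assignments toward balanced (regularized) solutions; alpha -> 1 recovers
    the plain maximum-likelihood solution."""
    prior_term = -(alpha - 1.0) * np.sum(np.log(P))
    return neg_log_lik(C, P) + prior_term
```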

Abstract:

Hyperspectral unmixing methods aim at the decomposition of a hyperspectral image into a collection of endmember signatures, i.e., the radiance or reflectance of the materials present in the scene, and the corresponding abundance fractions at each pixel in the image. This paper introduces a new unmixing method termed dependent component analysis (DECA). The method is blind and fully automatic, and it overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometry-based approaches. DECA is based on the linear mixture model, i.e., each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the non-negativity and constant-sum constraints imposed by the acquisition process. The endmember signatures are inferred by a generalized expectation-maximization (GEM) type algorithm. The paper illustrates the effectiveness of DECA on synthetic and real hyperspectral images.
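
The generative side of the linear mixture model can be sketched directly: abundances drawn from a Dirichlet density automatically satisfy the non-negativity and constant-sum constraints, and each pixel is a linear mixture of the endmember signatures. A minimal sketch for producing synthetic data of this kind follows; the dimensions, noise level, and random endmember matrix are arbitrary assumptions.

```python
import numpy as np

def synthetic_mixture(M, n_pixels, alpha, noise_std=0.0, seed=0):
    """Linear mixture model: each pixel y = M @ a + noise, with the abundance
    vector a drawn from a Dirichlet density, so its components are
    non-negative and sum to one (the physical constraints of the model)."""
    rng = np.random.default_rng(seed)
    A = rng.dirichlet(alpha, size=n_pixels)  # (n_pixels x p) abundances
    Y = A @ M.T + noise_std * rng.standard_normal((n_pixels, M.shape[0]))
    return Y, A

# Example: 3 endmembers observed over 50 spectral bands.
M = np.abs(np.random.default_rng(1).standard_normal((50, 3)))
Y, A = synthetic_mixture(M, n_pixels=1000, alpha=[1.0, 1.0, 1.0], noise_std=0.01)
```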