832 results for Recursive Filtering
Resumo:
The aim of this study was to introduce the tangential microfiltration (TMF) technique in the production of orange juice (TMFJ) and to compare it with pasteurised juice (control) as regards chemical composition and sensory characteristics. We used a TMF pilot equipped with four monotubular ceramic membranes (0.1, 0.2, 0.8 and 1.4 µm) arranged in series, with a filtering area of 0.005 m² each. Commercial flash-pasteurised orange juice was used as the initial product. Experiments were divided into three parts: a) characterisation of the TMF pilot; b) optimisation of operational conditions; c) production of the TMFJ. In the second part, the membrane with 0.8-µm pores presented the best flux, followed by those with 1.4-, 0.1-, and 0.2-µm pores. However, to guarantee permeate sterility, we chose the membrane with 0.1-µm pores for TMFJ production. Initially, the orange juice was sieved in order to separate part of the pulp and was subsequently submitted to TMF. A mixture of retentate and pulp was made and then pasteurised. We obtained the TMFJ by adding the permeate to this mixture. The TMFJ presented soluble solids content (°Brix), pulp, pH, and titratable acidity similar to the initial pasteurised juice (control). Nevertheless, 28% of the vitamin C was lost during TMFJ production. According to the juice taster panel, the control juice presented better sensory characteristics (greater aroma intensity and fruity flavour) than the TMFJ.
Resumo:
Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained by the use of time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC has been scarce due to limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with advances in the research of lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays as well. The aim of this thesis was to explore ways of utilizing time-resolved detection or anti-Stokes detection in POC applications. The long-lived fluorescence for the time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis, the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending up to the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for the time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry-chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored.
The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to a current standard method of measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefiting from the non-linear relationship between the excitation power and emission intensity of the UCPs, and enabled the amplification of the signal response from the indicator dye.
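The amplification effect of the non-linear power dependence can be illustrated with a small numeric sketch. This is a generic illustration, not the thesis's actual model: a quadratic dependence (n = 2, typical of two-photon upconversion) and equal filtering of excitation and emission are assumptions made here for clarity.

```python
# Hedged sketch: assumes emission scales with the n-th power of excitation
# (n = 2 assumed here); an indicator dye with transmittance T then attenuates
# the excitation path as T**n and the emission path linearly as T.
def upconversion_signal(transmittance, n=2):
    """Relative UCP signal reaching the detector when the dye filters
    both excitation (power n) and emission (linearly)."""
    return transmittance ** n * transmittance  # T**n (excitation) * T (emission)

# A dye transmitting 50% of the light cuts a linear readout to 0.5,
# but the upconversion readout drops to 0.5**3 = 0.125, i.e. the same
# change in dye absorbance produces a much larger signal response.
print(upconversion_signal(0.5))   # 0.125
print(upconversion_signal(1.0))   # 1.0 (no dye absorption)
```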
Resumo:
This thesis introduces an extension of Chomsky's context-free grammars equipped with operators for referring to the left and right contexts of strings. The new model is called grammar with contexts. The semantics of these grammars are given in two equivalent ways: by language equations and by logical deduction, where a grammar is understood as a logic for the recursive definition of syntax. The motivation for grammars with contexts comes from an extensive example that completely defines the syntax and static semantics of a simple typed programming language. Grammars with contexts maintain the most important practical properties of context-free grammars, including a variant of the Chomsky normal form. For grammars with one-sided contexts (that is, either left or right), there is a cubic-time tabular parsing algorithm, applicable to an arbitrary grammar. The time complexity of this algorithm can be improved to quadratic, provided that the grammar is unambiguous, that is, it allows only one parse for every string it defines. A tabular parsing algorithm for grammars with two-sided contexts has fourth-power time complexity. For these grammars there is a recognition algorithm that uses a linear amount of space. For certain subclasses of grammars with contexts there are low-degree polynomial parsing algorithms. One of them is an extension of the classical recursive descent for context-free grammars; the version for grammars with contexts still works in linear time like its prototype. Another algorithm, with time complexity varying from linear to cubic depending on the particular grammar, adapts deterministic LR parsing to the new model. If all context operators in a grammar define regular languages, then such a grammar can be transformed into an equivalent grammar without context operators at all. This allows one to represent the syntax of languages in a more succinct way by utilizing context specifications.
Linear grammars with contexts turned out to be non-trivial already over a one-letter alphabet. This fact leads to some undecidability results for this family of grammars.
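The classical recursive descent that the contextual version extends can be sketched for a toy context-free grammar. The balanced-parentheses grammar S → ( S ) S | ε is my own illustrative choice, not an example from the thesis:

```python
# Hedged sketch: classical recursive descent for the context-free grammar
# S -> '(' S ')' S | epsilon  (balanced parentheses), the prototype that
# the thesis extends with context operators.
def parse_s(s, i=0):
    """Match nonterminal S starting at position i; return the position
    after the match (epsilon always succeeds, consuming nothing)."""
    if i < len(s) and s[i] == '(':
        j = parse_s(s, i + 1)           # inner S
        if j < len(s) and s[j] == ')':
            return parse_s(s, j + 1)    # trailing S after the ')'
        return i                        # no closing paren: fall back to epsilon
    return i                            # epsilon

def recognizes(s):
    """True iff the whole string is a balanced-parentheses word."""
    return parse_s(s) == len(s)
```

Each nonterminal becomes one function, and the parser runs in linear time for this grammar, mirroring the linear-time property the thesis preserves in its contextual extension.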
Resumo:
Several physical and chemical methods are used for mineral beneficiation. The process comprises ore comminution, concentration and, finally, dewatering of the concentrate slurry. Ore is concentrated using, among other methods, flotation, leaching, magnetic separation and density-based separation methods. Water can be removed from the concentrate slurry by thickening and filtration. The environmental impact of a beneficiation process can be assessed by calculating the product's water footprint, which expresses the amount of water consumed in its production. This literature study presents mineral processing methods as well as treatment methods for process wastewaters. Based on literature sources, the water footprint of the copper anode produced at the Pyhäsalmi mine was determined, and methods were proposed for reducing the raw water consumption of the process. The water footprint of a copper anode produced from copper concentrate at Pyhäsalmi is 240 litres of H2O equivalent per tonne produced. Raw water consumption in the Pyhäsalmi process can be reduced by increasing internal water recirculation. Precipitation of calcium sulphate in pipes and pumps has emerged as a problem when increasing water recirculation. Calcium sulphate can be separated from water using techniques based on membranes, ion exchange and electrochemistry. In the alternative in which the thickener overflows of the concentrate slurry and tailings from all three flotation circuits, together with the filtrate waters from filtration, are directed to the same water treatment, an estimated 65% of the total water demand can be covered. This would save 3.4 Mm^3 of raw water per year and at the same time reduce the required size of the tailings ponds, which reduces environmental risks.
Resumo:
Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., that each point represent height above the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests where the canopy is normally very thick (partially closed), leading to a situation in which only a limited number of laser pulses reach the ground. Therefore, robust algorithms for extracting accurate DTMs in low-ground-point-density situations are needed in order to realize the full potential of airborne laser scanner data for forestry. The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and a low number of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest in this case is whether or not local forest communities can achieve the level of technical proficiency required for accurate forest monitoring.
The algorithms for extracting DTMs from LiDAR point clouds presented in this thesis address the challenges of extracting DTMs in low-ground-point situations and in complex terrain, while the algorithm for CBH estimation addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by factors such as variations in tree species and the season of data acquisition. These algorithms are adaptive (with respect to point cloud characteristics) and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. Comparison with existing DTM extraction algorithms showed that the DTM extraction algorithms proposed in this thesis performed better with respect to the accuracy of estimating tree heights from airborne laser scanner data. On the other hand, the proposed DTM extraction algorithms, being mostly based on trend surface interpolation, cannot retain small features in the terrain (e.g., bumps, small hills and depressions). Therefore, the DTMs generated by these algorithms are only suitable for forestry applications where the primary objective is to estimate tree heights from normalized airborne laser scanner data. The algorithm for estimating CBH proposed in this thesis, in turn, is based on the idea of a moving voxel, in which gaps (openings in the canopy) that act as fuel breaks are located and their height is estimated. Test results showed a slight improvement in CBH estimation accuracy over existing CBH estimation methods, which are based on height percentiles in the airborne laser scanner data. However, being based on the idea of a moving voxel, this algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity.
This information can be used to create vertical fuel continuity maps which can provide more realistic information on the risk of crown fires compared to CBH.
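The normalization step described in the abstract can be sketched minimally. This is a generic illustration, not the thesis's algorithm: the lowest return per grid cell stands in (crudely) for the trend-surface DTM interpolation, and all names and the cell size are illustrative:

```python
# Hedged sketch: point cloud normalization by subtracting a crude DTM,
# here the lowest return in each grid cell (a stand-in for proper ground
# classification and trend-surface interpolation).
from collections import defaultdict

def normalize(points, cell=5.0):
    """points: iterable of (x, y, z) tuples.
    Returns heights above the per-cell minimum elevation, in input order."""
    ground = defaultdict(lambda: float("inf"))
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        ground[key] = min(ground[key], z)        # lowest return = "ground"
    return [z - ground[(int(x // cell), int(y // cell))]
            for x, y, z in points]
```

In sparse-ground-return conditions, the quality of the interpolated ground surface (not this minimum filter) dominates the accuracy of the resulting tree heights, which is the problem the thesis addresses.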
Resumo:
This dissertation describes an approach for developing a real-time simulation of working mobile vehicles based on multibody modeling. The use of multibody modeling allows a comprehensive description of the constrained motion of the mechanical systems involved and permits real-time solving of the equations of motion. By carefully selecting the multibody formulation method to be used, it is possible to increase the accuracy of the multibody model while still solving the equations of motion in real time. In this study, a multibody procedure based on semi-recursive and augmented Lagrangian methods for real-time dynamic simulation applications is studied in detail. In the semi-recursive approach, a velocity transformation matrix is introduced to map the dependent coordinates into relative (joint) coordinates, which reduces the number of generalized coordinates. The augmented Lagrangian method is based on the use of global coordinates and, in that method, constraints are accounted for using an iterative process. A multibody system can be modelled with either rigid or flexible bodies. When using flexible bodies, the system can be described using a floating frame of reference formulation. In this method, the deformation modes needed can be obtained from the finite element model. As the finite element model typically involves a large number of degrees of freedom, a reduced number of deformation modes can be obtained by employing model order reduction methods such as Guyan reduction, the Craig-Bampton method and Krylov subspaces, as shown in this study. The constrained motion of working mobile vehicles is actuated by the force from the hydraulic actuator. In this study, the hydraulic system is modeled using lumped fluid theory, in which the hydraulic circuit is divided into volumes. In this approach, the pressure wave propagation in the hoses and pipes is neglected. The contact modeling is divided into two stages: contact detection and contact response.
Contact detection determines when and where contact occurs, and contact response provides the force acting at the collision point. The friction between tire and ground is modelled using the LuGre friction model, which describes the frictional force between two surfaces. Typically, the equations of motion are solved in full matrix format, where the sparsity of the matrices is not considered. Increasing the number of bodies and constraint equations leads to the system matrices becoming large and sparse in structure. To increase the computational efficiency, a technique for the solution of sparse matrices is proposed in this dissertation and its implementation demonstrated. To assess the computational efficiency, the augmented Lagrangian and semi-recursive methods are implemented employing the sparse matrix technique. The results of a numerical example show that the proposed approach is applicable and produces appropriate results within the real-time period.
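The benefit of exploiting sparsity can be illustrated with a minimal compressed sparse row (CSR) sketch, in which a matrix-vector product costs O(nnz) instead of O(n²). This is a generic illustration of the storage format, not the dissertation's actual implementation:

```python
# Hedged sketch: CSR storage keeps only nonzero entries, so a matrix-vector
# product touches each nonzero once -- the basic reason sparse techniques
# keep large multibody system matrices tractable in real time.
def to_csr(dense):
    """Convert a dense 2-D list into CSR arrays (data, cols, row pointers)."""
    data, cols, ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0.0:
                data.append(v)
                cols.append(j)
        ptr.append(len(data))        # cumulative nonzero count per row
    return data, cols, ptr

def csr_matvec(data, cols, ptr, x):
    """y = A @ x using only the stored nonzeros of A."""
    return [sum(data[k] * x[cols[k]] for k in range(ptr[i], ptr[i + 1]))
            for i in range(len(ptr) - 1)]
```

Production code would use a tuned library routine rather than this sketch, but the storage layout and the per-nonzero cost are the same idea.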
Resumo:
The chemical plant examined in this study is a power-intensive industrial facility connected to the 110 kV transmission grid. The harmonic filters of the electricity distribution network, adapted to the operation of the processes, and their operating configurations play an important role in keeping reactive power generation within the reactive power window of the grid connection and in providing sufficient harmonic filtering while minimizing losses. The compensation and harmonic filtering of the plant's distribution network were last studied in 2003. Since then, the network components have aged, process operation and the low-voltage network have changed, and the plant is partly operated by different personnel. A need has arisen for a survey of the current state and an analysis of network development targets as a follow-up to the previous study. The study relies heavily on field measurements, on the basis of which, together with the literature and the plant's systems, the reactive power levels are determined at the key points of the network at the low-, medium- and high-voltage levels. The study proposes rearranging the filters of the caustic soda plant to reduce the 4th harmonic and to decrease unit size. The situation at the transmission grid connection point was acceptable. The study proposes low-voltage compensation at switchboard KF-4-100, increasing the capacity of the backup connection. The study produced general knowledge about network operation and identified the best operating configurations for staying within the reactive power window without reactive power charges.
Resumo:
Software product metrics aim at measuring the quality of software. Modularity is an essential factor in software quality. In this work, metrics related to modularity, and especially to the cohesion of modules, are considered. The existing metrics are evaluated, and several new alternatives are proposed. The idea of cohesion of modules is that a module or a class should consist of related parts. The closely related principle of coupling says that the relationships between modules should be minimized. First, internal cohesion metrics are considered. The relations that are internal to classes are shown to be useless for quality measurement. Second, we consider external relationships for cohesion. A detailed analysis using design patterns and refactorings confirms that external cohesion is a better quality indicator than internal cohesion. Third, motivated by the successes (and problems) of external cohesion metrics, another kind of metric is proposed that represents the quality of the modularity of software. This metric can be applied to refactorings related to classes, resulting in a refactoring suggestion system. To describe the metrics formally, a notation for programs is developed. Because of the recursive nature of programming languages, the properties of programs are most compactly represented using grammars and formal languages. The tools that were used for metrics calculation are also described.
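As a hedged illustration of what an internal cohesion metric measures (the kind this thesis argues is a weak quality indicator), a toy computation in the LCOM family might look like the following; the thesis's own metric definitions may differ:

```python
# Hedged sketch of an LCOM-style internal cohesion metric: count method
# pairs that share no attribute versus pairs that share at least one.
# A cohesive class scores 0; a class of unrelated parts scores high.
from itertools import combinations

def lcom(methods):
    """methods: dict mapping method name -> set of attributes it uses.
    Returns max(0, non_sharing_pairs - sharing_pairs)."""
    share = no_share = 0
    for a, b in combinations(methods.values(), 2):
        if a & b:
            share += 1      # the pair touches at least one common attribute
        else:
            no_share += 1   # the pair is internally unrelated
    return max(0, no_share - share)
```

A metric like this looks only at relations internal to the class; the thesis's point is that external relationships (how clients use the class) are the better quality signal.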
Resumo:
The Baltic Sea is a unique environment that contains unique genetic populations. In order to study these populations on a genetic level, basic molecular research is needed. The aim of this thesis was to provide a basic genetic resource for population genomic studies by de novo assembling a transcriptome for the Baltic Sea isopod Idotea balthica. RNA was extracted from a single whole adult male isopod and sequenced using Illumina (125 bp PE) RNA-Seq. The reads were preprocessed using FASTQC for quality control, TRIMMOMATIC for trimming, and RCORRECTOR for error correction. The preprocessed reads were then assembled with TRINITY, a de Bruijn graph-based assembler, using different k-mer sizes. The different assemblies were combined and clustered using CD-HIT. The assemblies were evaluated using TRANSRATE for quality and filtering, BUSCO for completeness, and TRANSDECODER for annotation potential. The 25-mer assembly was annotated using PANNZER (protein annotation with z-score) and BLASTX. The 25-mer assembly represents the best first-draft assembly, since it contains the most information. However, this assembly shows high levels of polymorphism, which currently cannot be differentiated into paralogs or allelic variants. Furthermore, this assembly is incomplete, which could be improved by sampling additional developmental stages.
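The de Bruijn graph construction underlying TRINITY can be sketched in miniature. This is a generic textbook version, not TRINITY's implementation; real assemblers add error correction, coverage weighting, and component-wise graph traversal:

```python
# Hedged sketch: build a de Bruijn graph from reads by decomposing them
# into overlapping k-mers; nodes are (k-1)-mers, and each k-mer contributes
# an edge from its prefix to its suffix.
from collections import defaultdict

def de_bruijn(reads, k):
    """Return a dict mapping each (k-1)-mer to the set of (k-1)-mers
    it is followed by in some read."""
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])   # prefix -> suffix edge
    return graph
```

The choice of k (e.g. the 25-mers above) trades specificity against connectivity: larger k-mers resolve repeats better but fragment the graph when coverage is low.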
Resumo:
Spatial data representation and compression have become focus issues in computer graphics and image processing applications. Quadtrees, as hierarchical data structures based on the principle of recursive decomposition of space, offer a compact and efficient representation of an image. For a given image, the choice of the quadtree root node plays an important role in its quadtree representation and final data compression. The goal of this thesis is to present a heuristic algorithm for finding a root node of a region quadtree that is able to reduce the number of leaf nodes when compared with the standard quadtree decomposition. The empirical results indicate that the proposed algorithm improves the quadtree representation and data compression in comparison with the traditional method.
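The standard region-quadtree decomposition that the heuristic is compared against can be sketched as follows. This is a generic implementation for square binary images; the thesis's data structures and the root-placement heuristic itself are not reproduced here:

```python
# Hedged sketch: standard region-quadtree decomposition of a square binary
# image. A block that is uniform (or a single pixel) becomes one leaf;
# otherwise it is split into four quadrants recursively. The leaf count is
# the quantity a good root-node choice tries to reduce.
def quadtree_leaves(img, x, y, size):
    """Count leaves of the region quadtree over img[y:y+size][x:x+size]."""
    vals = {img[y + j][x + i] for j in range(size) for i in range(size)}
    if len(vals) == 1 or size == 1:
        return 1                              # uniform block -> one leaf
    h = size // 2
    return (quadtree_leaves(img, x,     y,     h) +
            quadtree_leaves(img, x + h, y,     h) +
            quadtree_leaves(img, x,     y + h, h) +
            quadtree_leaves(img, x + h, y + h, h))
```

Shifting the decomposition origin (effectively moving the root node) changes which pixels land in which quadrants, which is why root choice affects the final leaf count.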
Resumo:
Ontario bansho is an emergent mathematics instructional strategy, used by teachers working within communities of practice, that has been deemed to have a transformational effect on teachers' professional learning of mathematics. This study sought to answer the following question: How does teachers' implementation of Ontario bansho within their communities of practice inform their professional learning process concerning mathematics-for-teaching? Two other key questions also guided the study: What processes support teachers' professional learning of content-for-teaching? What conditions support teachers' professional learning of content-for-teaching? The study followed an interpretive phenomenological approach to collect data, using a purposive sampling of teachers as participants. The researcher conducted interviews and followed an interpretive approach to data analysis to investigate how teachers construct meaning and create interpretations through their social interactions. The study developed a model of professional learning made up of 3 processes in which the participants engaged (informing with resources, engaging with students, and visualizing and schematizing) and 2 conditions (ownership and community) that supported the 3 processes. The 3 processes occur in ways that are complex, recursive, nonpredictable, and contextual. This model provides a framework for facilitators and leaders to plan for effective, content-relevant professional learning by placing teachers, students, and their learning at the heart of professional learning.
Resumo:
Ontario school principals’ professional development currently includes leadership training that encompasses emotional intelligence. This study sought to augment the limited research in the Canadian educational context on school leaders’ understanding of emotional intelligence and its relevancy to their work. The study utilized semi-structured interviews with 6 Ontario school principals representing disparate school contexts based on socioeconomic levels, urban and rural settings, and degree of ethnic diversity. Additionally, the 4 male and 2 female participants are elementary and secondary school principals in different public school boards and represent a diverse range of age and experience. The study utilized a grounded theory approach to data analysis and identified 5 main themes: Self-Awareness, Relationship, Support, Pressure, and Emotional Filtering and Compartmentalization. Recommendations are made to further explore the emotional support systems available to school leaders in Ontario schools.
Resumo:
Indwelling electromyography (EMG) has great diagnostic value, but its invasive and often painful characteristics make it inappropriate for monitoring human movement. Spike shape analysis of the surface electromyographic signal responds to the call for non-invasive EMG measures for monitoring human movement and detecting neuromuscular disorders. The present study analyzed the relationship between surface and indwelling EMG interference patterns. Twenty-four males and twenty-four females performed three isometric dorsiflexion contractions at five force levels from 20% to maximal force. The amplitude measures increased differently between electrode types, attributed to electrode sensitivity. The frequency measures differed between traditional and spike shape measures due to different noise rejection criteria. These measures also differed between surface and indwelling EMG due to the low-pass tissue filtering effect. The spike shape measures, thought to collectively function as a means to differentiate between motor unit characteristics, changed independently of one another.
Resumo:
Despite recent well-known advancements in patient care in the medical fields, such as patient-centeredness and evidence-based medicine and practice, rather little is known about their effects on the particulars of clinician-patient encounters. The emphasis in clinical encounters remains mostly on treatment and diagnosis and less on communicative competency or engagement for medical professionals. The purpose of this narrative study was to explore interactive competencies in diagnostic and therapeutic encounters and intake protocols within the context of the physicians’, nurses’, and medical receptionists’ perspectives and experiences. Literature on narrative medicine, phenomenology and medicine, therapeutic relationships, cultural and communication competency, and non-Western perspectives on human communication provided the guiding theoretical frameworks for the study. Three data sets were used: 13 participant interviews (5 physicians, 4 nurses, and 4 medical receptionists), policy documents (physicians, nurses, and medical receptionists), and a website (Communication and Cultural Competency). The researcher then engaged in triangulated analyses, including N-Vivo, manifest and latent analysis, Mishler’s (1984, 1995) narrative elements and Charon’s (2005, 2006a, 2006b, 2013) narrative themes, in recursive, overlapping, comparative and intersecting analysis strategies. A common factor affecting physicians’ relationships with their clients was limitation of time, including limited time (a) to listen, (b) to come up with a proper diagnosis, and (c) to engage in decision making in critical conditions, as well as limited time for patients’ visits. For almost all nurse participants in the study, establishing therapeutic relationships meant being compassionate and empathetic. The goals of intake protocols for the medical receptionists were about being empathetic to patients, being attentive listeners, developing rapport, and being conventionally polite to patients.
Participants with the least amount of training and preparation (medical receptionists) appeared to be more committed to working narratively in connecting with patients and establishing human relationships, as well as in listening to patients’ stories and providing support to narrow down the reason for their visit. The diagnostic and intake “success stories” regarding patient clinical encounters for the other study participants focused on a timely securing of patient information, with some acknowledgement of rapport and empathy. Patient-centeredness emerged as a discourse practice, with ambiguous or nebulous enactment of its premises in most clinical settings.
Resumo:
Digital Terrain Models (DTMs) are important in geology and geomorphology, since elevation data contain a great deal of information pertaining to the geomorphological processes that shape the topography. The first derivative of topography is attitude; the second is curvature. GIS tools were developed for the derivation of strike, dip, curvature and curvature orientation from Digital Elevation Models (DEMs). A method for displaying both strike and dip simultaneously as a colour-coded visualization (AVA) was implemented. A plug-in for calculating strike and dip via least squares regression was first created using VB.NET. Further research produced a more computationally efficient solution, convolution filtering, which was implemented as Python scripts. These scripts were also used for the calculation of curvature and curvature orientation. The application of these tools was demonstrated by performing morphometric studies on datasets from Earth and Mars. The tools show promise; however, more work is needed to explore their full potential and possible uses.
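The equivalence exploited above, that least squares plane fitting on a regular grid reduces to fixed convolution kernels, can be sketched for the x-gradient. Unit cell spacing and the 1/6 normalization follow from the least squares solution over a 3x3 window; this is a generic derivation, not the thesis's exact scripts:

```python
# Hedged sketch: fitting z = a*x + b*y + c over a 3x3 window by least
# squares gives a = sum(x_i * z_i) / sum(x_i**2) with x in {-1, 0, 1},
# i.e. a fixed convolution kernel divided by 6 -- the basis of the
# "convolution filtering" speed-up over per-window regression.
KX = [[-1, 0, 1],
      [-1, 0, 1],
      [-1, 0, 1]]

def slope_x(dem, r, c):
    """Least-squares dz/dx at interior cell (r, c) of 2-D list `dem`,
    assuming unit cell spacing."""
    acc = 0.0
    for j in range(3):
        for i in range(3):
            acc += KX[j][i] * dem[r - 1 + j][c - 1 + i]
    return acc / 6.0
```

Dip is then recovered from the gradient magnitude and strike from its direction; applying the kernel over the whole raster is a single convolution pass instead of one regression per cell.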